

Revista Colombia Médica
Universidad del Valle - Facultad de Salud
ISSN: 0120-8322 EISSN: 1657-9534
Vol. 41, Num. 1, 2010, pp. 85-97
Effectiveness of health promotion and public health interventions: Lessons from Latin American Cases1

Efectividad de intervenciones en promoción de la salud y salud pública: Lecciones de casos latinoamericanos

Ligia de Salazar, PhD2

1This article incorporates part of the outcomes of the Latin American program for health evaluation and health promotion developed by CEDETES-Universidad del Valle with financial support from the United States Centers for Disease Control and Prevention (CDC) through the Cooperative Agreement CDC-CEDETES-Universidad del Valle, 5 U13 DP000618-03/3 U13 DP000618-03W1.

2Director, Center for the Development and Evaluation of Public-Health Policies and Technology (CEDETES), School of Health, Universidad del Valle; Full Professor, School of Public Health, Universidad del Valle, Cali, Colombia. e-mail: cedetes@cedetes.org, lsalazar@emcali.net.co

Received for publication December 16th, 2009
Accepted for publication January 12th, 2010


SUMMARY

The issue of evidence of effectiveness in public health and health promotion has occupied the attention of academics, technicians, and politicians, who recognize not only its importance but also the challenges that researchers must confront to obtain reliable and useful information on which to base public policy decisions and the investment of resources. Despite the emergence of Latin American initiatives aimed at building the capacity to obtain such evidence of effectiveness, the practice is still incipient in the region and relevant publications are few. This article is offered as a contribution and a stimulus to the ongoing search for information and knowledge concerning the relevance and scope of the practice of public health and health promotion in addressing adverse health conditions. For this purpose, a review of the literature was carried out, along with the compilation, documentation, and analysis of various Latin American evaluation cases. Based on this analysis, proposals are presented for strengthening evaluation theory and practice in Latin America.

Keywords: Effectiveness evaluation; Health promotion; Public health evidence.

RESUMEN

El tema de evidencias de efectividad en promoción de la salud y salud pública ha ocupado la atención de académicos, técnicos y políticos, quienes reconocen no sólo su importancia, sino los desafíos que hay que enfrentar para obtener información confiable y útil que fundamente decisiones en política pública e inversión de recursos. Pese a diversas iniciativas latinoamericanas orientadas a la construcción de capacidad para obtener dichas evidencias de efectividad, esta práctica es aún incipiente en la región y son escasas las publicaciones al respecto. Este artículo surge como una contribución y una provocación para motivar la búsqueda permanente de información y conocimiento sobre la pertinencia y alcance de la práctica de la salud pública y la promoción de la salud, la comprensión de los procesos de implementación de las políticas y programas y la valoración de los resultados. Para esto, se ha realizado una revisión de literatura, así como recopilación, documentación y análisis de diversos casos de evaluación latinoamericanos. Como producto del análisis se presentan propuestas para el fortalecimiento de la teoría y la práctica evaluativa en Latinoamérica.

Palabras clave:  Efectividad; Evaluación; Promoción de la salud; Evidencias en salud pública.

FACTS, DEBATES, AND DEVELOPMENTS REGARDING THE PRACTICE AND EVALUATION IN PUBLIC HEALTH AND HEALTH PROMOTION IN LATIN AMERICA

The debate concerning evaluation, and what constitutes evidence of effectiveness for public health interventions viewed from the perspective of health promotion principles, has been enriched by contributions from various scholars: advocates of rethinking the subjects and objects of evaluation, indicators for assessing the success of interventions, methodological approaches for answering evaluation questions, criteria for judging the validity and reliability of the evidence, and the kind of information that decision makers and primary users of the data should receive1-6.

The complexity of population-based interventions, such as public health interventions with a health promotion focus, and of the systems in which they are implemented, together with their participatory, multi-strategy, and multi-sectoral nature, poses challenges for assessing their effectiveness and for establishing associations between interventions and outcomes. An example of these limitations is the ongoing questioning of the relevance and utility of applying traditional epidemiological designs, whose validity depends both on scientific rigor and on the assumptions required by analytic studies4,7.

Therefore, public health evaluation highlights the need to develop suitable approaches and evaluation methods to identify, understand, and assess the processes and outcomes of interventions along their political, social, and economic dimensions, as well as their contribution to equity and to the quality of life of populations; that is, from the perspective of health promotion. Hence, beyond etiological explanations obtained under ideal or controlled conditions, the evaluation of public health initiatives from the perspective of health promotion must produce information about the feasibility and sustainability of processes of social and political change, about progress or intermediate results, and about effectiveness as evidenced by the achievement of the initiative's objectives.

As to the nature of public health initiatives viewed from the perspective of health promotion, it has been recognized that they expand the traditional view by encompassing the complexity of social change: they do not act merely on the de-medicalization and reorientation of health services and practices, but also make inroads into local development and empowerment, the defense of public policy, and more efficient and just national development, acting directly on the social determinants of health; in other words, they intervene not only in the proximal but also in the distal causes of population health status2.

One of the main facts generating broad debate in Latin America is that health promotion in the region has been instituted on the basis of foreign models, which do not necessarily take into account the needs and the social and political systems of our countries. It is therefore necessary to identify the essential components of its definition, theory, and practice within the regional context, taking into consideration our own cultural, social, political, and economic characteristics. This makes the assessment of effectiveness in public health practice and health promotion highly dependent on context; consequently, the factors that influence implementation processes and determine their viability and sustainability are also subject to analysis, understanding, and evaluation. In this sense, the evaluation indicators traditionally used for morbidity, mortality, and risk factors do not necessarily capture these elements of the change process or the impact on the health of the population. Therefore, the conceptual clarification and historical roots that account for the nature and extent of the policies or programs being evaluated, as well as the logical framework linking their activities and resources to the desired changes, constitute important and critical parts of any evaluative process.

This explains the position of several authors who assert that evaluations of complex interventions must account not only for results but also for inputs and change processes, so as to understand communities as complex systems and to grasp how health problems or phenomena of interest are produced by the system8-10. Campbell et al., cited by Stead et al.11, argue that even when an intervention is defined as complex, a simple-intervention mindset can prevail when one tries to describe it. The authors also warn that complexity should not be taken as an excuse to mean that anything can happen.

Although there is no standardized way to perform evaluation, it can be argued that there is consensus regarding its conceptual definition. Hawe et al.12 indicate that evaluation is a judgment about something, and they add that the way we judge depends on expectations, past experiences, and what we consider important or not; this affects how the assessment is conducted, the interests it serves, and the methods employed. For their part, Brownson et al.13 define evaluation as the process of analyzing programs and policies, as well as the context within which they occur, to determine whether their implementation requires changes and to assess intended and unintended consequences, which include, but are not limited to, determining whether the goals and objectives are being achieved.

It is recognized that in the last twenty years public health has shifted in its conception, its fundamentals, and its practices, nourished by the principles and values that the health promotion strategy promotes. During this time, questions have arisen concerning the success of policies and interventions in this field, the factors determining their success or failure, and the costs and economic, social, political, and cultural consequences of achieving them. It is undeniable that Latin America has unique circumstances, characteristic of its political, socioeconomic, and cultural context, that must be taken into account both in formulating interventions and in evaluating them. A review of the literature, together with the Latin American experience in evaluation capacity building and participation in various academic and research settings, shows that, in relation to the conception, practice, and evaluation of health promotion in the region, problematic questions remain on the agenda, such as: What is meant by evidence of effectiveness for interventions whose essential ingredients are influenced by the context, and whose nature departs from standard patterns of analysis grounded only in the knowledge and rules of scientific disciplines? Is the operational definition of evidence and effectiveness different from the perspective of public health and health promotion? How do the different conceptions of public health and health promotion influence the methodological approaches to evaluating effectiveness? Do these approaches assess the process of building capacity to make decisions and intervene in the structural causes of illness and death? Are the epidemiological criteria established for judging the validity and reliability of information pertinent and applicable to the effectiveness of complex interventions? These questions about reliable and useful assessment encourage a further search for answers that satisfy not only our intellect but also our social responsibility to be active participants in the development processes of populations and territories.

Based on the above, this paper presents an exercise in critical thinking about issues of assessment, evidence, and effectiveness, in response to the question of whether we are doing what we should, or only what we are able to do, given our circumstances and differing realities. Are we actually practicing public health from the perspective of health promotion in Latin America?

Experience and an extensive literature review revealed the scarcity of published, indexed bibliographical material on public health evaluation and health promotion produced in our language and attuned to the Latin American regional context. The central inputs for this analysis were therefore the author's practical knowledge and the results of a large body of evaluation cases or experiences rigorously documented and reported by representatives of different Latin American countries who have participated in a broad regional initiative for capacity building around the evaluation of effectiveness in health promotion, under way in Latin America for over ten years.

This initiative has been developed with the active participation of the Center for the Development and Evaluation of Public-Health Policies and Technology (CEDETES) at the Universidad del Valle in Colombia, the WHO/PAHO Collaborating Center, and the Foundation for the Development of Public Health in Colombia (FUNDESALUD). It has had the impetus and financial support of international agencies such as the International Union for Health Promotion and Education (IUHPE), the Pan American Health Organization (PAHO), and the Centers for Disease Control and Prevention (CDC), through a Latin American training program on effectiveness assessment and economic evaluation involving more than 450 professionals from over 18 countries in the region.

This paper presents the central elements of debate and controversy regarding the theory and practice of evaluating Latin American public health and health promotion initiatives, together with an analysis of the results of the assessment cases. In doing so, we have considered the complex nature of the interventions, since it demands innovative methodological approaches to evaluation in order to establish valid associations between interventions and outcomes, as well as to detect their so-called «active ingredients» and to understand the interactions among them8,9. The analysis of this information makes it possible to strengthen theory and to formulate new hypotheses about the assumptions, or grounds, on which the practice of health promotion and public health in the Latin American context is based.

HOW WAS THE INFORMATION COLLECTED AND DOCUMENTED FOR THIS ANALYSIS?

This work adopted three methods for collecting information on experiences of effectiveness evaluation for public health and health promotion interventions in Latin America: a review of the published literature; an exploration of experiences by means of a survey questionnaire collecting qualitative information; and the documentation of actual assessment cases designed or implemented in the countries as a result of participation in the Latin American evaluation training program.

For the indexed literature review, the search and selection criteria covered articles published from 1986 to 2006 that reported experiences of public health and evaluation in Latin America. We reviewed the Medline and Lilacs online databases. The descriptors employed included «effectiveness», «promotion», «health», «evidence», and «Latin America», as well as «efectividad», «promoción», «salud», and «evidencia». The review was supplemented with searches by country, using «health promotion» and «promoción de la salud» as descriptors, and ended with an inquiry into publications by country for each of the five components of the health promotion strategy as described in the Ottawa Charter.

The exploration of experiences was conducted using a questionnaire and a format for documenting and systematizing published and unpublished experiences in different countries. The questionnaire was validated and distributed by e-mail to a list of 600 public health practitioners linked to the Latin American Network of Evidence of Effectiveness in Health Promotion, sponsored by the IUHPE, and to the list of the Network of Municipalities and Healthy Communities, led by PAHO. Information was collected on the institution, the key contact, a description of the evaluation/intervention experience in public health and health promotion, objectives, coverage, participants, and partners. It also included information on the communication of evaluation results, in terms of written materials available on the experience, whether published or unpublished.

Complete evaluation cases or experiences in health promotion were obtained from the documentation produced by participants in the Latin American courses and workshops on effectiveness evaluation carried out in the region between 2004 and 2009. To develop the evaluation cases in the training courses, guided exercises were conducted, together with opportunities for individual and collective learning, which motivated participants to analyze their practices and, where appropriate, to reconsider the evaluation process in one or all of its phases.

To synthesize the information collected through the three search methods, a descriptive matrix was constructed. Matrices were prepared by sub-region, indicating the title of the publication or the name of the experience, year, country, objectives, methods, results, and sources. Indexed information was also sorted and classified by taking into account whether there was access to the full text, abstract, or only the reference title/author/source.

To process and analyze the information, descriptive categories corresponding to the components of health promotion as defined in the Ottawa Charter were used (creation of capacities and healthy environments, personal skills, public policy, strengthening community action, and reorienting services), and an additional component that emerged from inductive analysis was added (theories and reflections on health promotion). Similarly, for the cases and practical evaluation experiences, the central elements of the evaluation process were treated as analytic categories, as was the presence of factors that determined effectiveness, according to an extensive theoretical and methodological framework previously constructed on the subject. These categories were: definition of the problem situation; type, scope, and design of the interventions; implementation of the interventions; evaluation questions; methodological approaches of the evaluations; and use of the information.

Concomitant with the thematic analysis, an analysis was carried out by type of study and design, number of studies, year of publication, geographical origin, and an appraisal of the output found in the indexed literature, its sources, and languages.

In total, 185 cases were obtained: 126 corresponded to bibliographic references retrieved according to the search criteria for indexed journals, 34 to experiences collected by means of the questionnaire, and 25 to complete evaluation cases documented as a result of the evaluation training processes conducted in the region. Of the indexed publications found, 81% provided access to the abstract or the full text (44% to the abstract only and 37% to the complete document); for the remaining 19%, only the title, authors, and publication source could be obtained. Likewise, of the articles found, merely 29% were reported as evaluation studies, most concerned with performance and process and, to a lesser extent, with effectiveness (4%). In 18% of the articles reviewed, the type of study was not specified. In all, the data collected came from 25 countries, 46% from South America and 54% from Central America and the Caribbean. The countries with the greatest number of experiences included Mexico, Brazil, Argentina, Colombia, Peru, and Puerto Rico.

DEVELOPMENT AND DISCUSSION OF THE TOPIC

General overview of practice and evaluation in public health and health promotion in Latin America. In spite of an apparent worldwide consensus on the need for public health and health promotion to demonstrate their effectiveness and play an increasingly important role in overall public health policy, different arguments have been advanced on this issue. Some believe that the future of health promotion will depend on its ability to demonstrate scientifically that it is an effective field for public health action14. Campostrini15 affirms that it is almost impossible to find absolute truth in interventions to promote health and proposes that it is best to observe, measure, and analyze such interventions through their shadow (the evidence) in order to interpret reality. The author therefore recommends that those interested in assessing the effectiveness of health promotion interventions seek the most appropriate «shadow», following this metaphor.

Stead et al.11 argue that, in most cases, these kinds of interventions are difficult to describe in terms of programs and do not lead to precise statements about independent variables whose effects can be measured and easily replicated. Interventions that seek political change face additional challenges. Clark and McLeroy16, cited by Stead et al.11, state that, ideally, evaluation should demonstrate that the intervention strategy produced the political changes, but that making this connection (attribution) is almost impossible.

Given the complex nature of public health and health promotion interventions, it is difficult to establish evidence for these interventions as proof of causality, because the criteria for assessing causality belong to biomedical science and to a probabilistic notion in which one variable, preceded by another, produces an effect. Hence, some authors point out that when studying the behavior of individuals, organizations, or political processes, the explanatory power of science is limited, because the rules of natural science fit poorly with those of social science17.

Hawe et al.8 propose a critical analysis of the intervention logic (a logical framework) to help construct or reconstruct better-founded and more effective interventions and evaluations. The logical framework of an intervention refers to its characterization: it makes it possible to know the objectives of the intervention, the activities implemented, their purposes, and the strategies through which they are to be met. With this in mind, the following is an analysis of the state of practice and evaluation of health promotion and public health interventions in Latin America, taking into account the variables and categories indicated above.

WHERE IS PUBLIC HEALTH AND EVALUATION HEADED IN LATIN AMERICA?

Rhetoric, facts, progress, and frustrations in public health. The definition of the problem or situation one seeks to change sets the type, nature, and scope of the intervention, the actions that must be developed, and the dimensions of practice, as well as their degree of complexity. Traditionally, analyses of health problems have been made from three paradigms: the first assumes that diseases or health problems are products of biology; the second, that they are products of our individual responsibility because we do not improve our behavior; and the third, that they stem from the so-called «causes of causes», or social determinants.

In the Latin American experiences reviewed, the definition of the health problems the interventions seek to address ignores or does not consider the structural causes of those problems or the influence of sectors other than health. In most cases analyzed, the operation of the interventions reflected a definition of, and approach to, health problems based on their proximal causes, such as lifestyles, actions to inform the community about how to control risk factors (unhealthy diet, consumption of psychoactive substances and alcohol, sedentary lifestyle), and health access and coverage. Factors within the political, social, and cultural contexts that cause the problem and influence the process of achieving change, as well as indicators of intervention success or failure, were absent from the formulation of almost all the experiences. Similarly, and despite the recognition that the poor suffer from worse health18, the social gradient of health within countries and the major health inequities caused by the unequal distribution of power, income, goods, and services, and the consequent injustices, were not taken into account in formulating the interventions and documented experiences.

Types of interventions. As a result of the way population health problems are defined, the analysis reveals that the practice of public health and health promotion in the region is heavily weighted toward interventions aimed at reducing health problems and risks through strategies to improve access to health services, behavioral change, and lifestyle modification. Latin America still seems quite far from permanent processes and from building sustainable strategies or activities to address the social determinants of health and to change the policies and social contexts that promote and maintain adverse situations.

It is noteworthy that, among the components of the initiatives, priority is given, in theory, to issues such as participatory management, social and community organization, networking, and the development of care models connecting comprehensive primary health care with health education. However, the problems most commonly addressed in practice were those related to coverage and medical care, the control and prevention of disease and risk factors, community actions for the formation and strengthening of social networks, and, to a lesser extent, the use of evidence for decision making. Public policies to legislate and promote initiatives that seek to exercise the right to health and to meet targets for reducing and controlling poverty, a structural cause of many health problems, are mentioned but do not materialize or find real expression in practice. Similarly, concrete strategies and approaches were not readily apparent for achieving clear or in-depth changes in some key areas, such as advocacy, strategic alliances to influence decisions that affect health, capacity building, empowerment to take part in decisions affecting individual and population health, and actions to balance power relations between key actors; on the contrary, the facts show that empowerment has been associated mainly with activities and educational programs related to lifestyles.

If we analyze the other components of health promotion, it is also striking that most of the interventions presented are directed toward the service reorientation component, reflecting the prevalence of a biological approach to health issues despite the theoretical insistence on the need to intervene in the determinants of health. The review of the indexed literature showed that most studies in health promotion focused on the personal skills component (40%), 17% on theories and reflections on health promotion, and 15% on reorienting services. In the same review it is noteworthy that, of all the publications found, only 8% focused on the component of building opportunities and healthy environments and only 7% on healthy public policies, considering the priority of these topics and the considerable efforts that various countries and organizations have made in this regard. Pellegrini19 had already warned, in his study, of the inconsistencies between the themes of Latin American publications and the health priorities of those countries.

On the other hand, a survey conducted by PAHO on competencies and on the identification of factors influencing the success or effectiveness of health promotion interventions20 showed that there is great difficulty in defining the criteria for «success» or effectiveness of interventions addressing social problems in a region like Latin America, which remains the most inequitable in the world and whose work in health promotion is focused mainly on individual behavioral change, without addressing the major socio-political issues that generate health inequities in the region. The survey also notes that the actions undertaken are often unsuccessful because there is insufficient conceptual clarity to generate, stimulate, and apply appropriate interventions.

Scope of interventions. The vast majority of reported experiences have been carried out at the community and municipal levels, with a few exceptions at the national level, a fact that to some extent reflects the limited commitment of governments to these initiatives. Similarly, most of the initiatives correspond to relatively short periods of government, which hinders the achievement of results and affects financial sustainability, continuity, and legislative support. The strategies and actions proposed for achieving the objectives are collaborative in nature, opportune, voluntary, and short-term, and have little financial and legal backing to give them continuity. A specific case is that of interventions related to the education sector, centered on training activities for grassroots groups, which are reflected neither in policy, nor in institutional and territorial development plans, nor in legislation, nor in curricular plans for health professionals.

There are few cases with an expressed intention and specific proposals for creating the conditions that would meet the principles and values on which health promotion and the conception of the so-called new public health are based. Similarly, the experiences reported express the intention of improving the quality of life and welfare of the population as a result of the interventions; however, the actions carried out, as reported in several cases, were insufficient to promote the expected change, especially when, in several Latin American countries, health systems and social protection are geared toward the privatization of services and access to them rests on the users' capacity to pay.

While one of the core values of health promotion and public health is the right to health of all citizens, in practice what drives action is individual need, especially in times of crisis. Hence, action becomes cyclical and volatile and relates to individuals rather than to citizens. Further discussion is therefore required on the practical meaning of health as a right and on the political actions and strategies needed to build and sustain it and, as Bambra et al.21 affirm, to demonstrate how recognition of the political nature of health would lead to a more effective strategy for health promotion and to greater evidence of the effectiveness of its practice.

A potential asset in the region is the motivation behind, and variety of, current interventions to construct healthy environments, including measures to increase service coverage through the Primary Care Strategy, from which one can envision actions that could influence the determinants of health. Raising awareness of this potential and enhancing the capacity of the promoters of these initiatives in advocacy, leadership, social management, and public policy, among other areas, are demands that must be addressed to strengthen and sustain these processes of change.

Design of interventions. The accumulated experiences and cases analyzed show that the design and operation of the interventions reflect narrow conceptions of health and its causes, as well as of the conception and scope of public health and health promotion. The design of interventions has largely responded to a disciplinary vision based on unverified assumptions and on a short-term perspective that confirms and standardizes the existing norm. One could say that the programs and activities designed are those that the promoters of these initiatives are able to carry out, taking into account only what they know of the problem and what they are able to do given their knowledge, skills, practice settings, and obligations under existing regulations.

The scope and intentions of health promotion and public health demand changes in the power structures within the sector and in the state apparatus in general, in order to prevent conflicts and tensions and to make implementation and results viable. Working from existing structures while maintaining the status quo is an indicator that we are adopting the option of doing only what we can, instead of doing what we should.

In summary, we may conclude that the main emphasis in the practice of public health and health promotion in Latin America has been on interventions whose design is directed toward individual, sectoral, and local actions, dependent on the norms of the current health system and on the capacity of those responsible for executing them.

Do we create conditions in which interventions are effective? The contribution of implementation. The most common definition of program implementation relates to the question «how well is a program or intervention put into practice» (fidelity)22. The documentation and systematization of practice provides information to answer this question. The implementation of community or population health programs is not easily standardized, and standardization may not even be desirable. Given the multiple activities involved in these programs, and considering that they are also guided by principles such as collaborative practice, partnerships, and the active participation of community members, documenting these interventions becomes all the more necessary. These programs are dynamic and need to respond and adapt to local circumstances; they therefore require a continuous flow of information to understand and assess the implementation process22-25. To this we add that such evidence comes not only from the documentation of processes, but also from critical reflection on them, from an in-depth understanding of how results are obtained, and from the factors that influenced them; that is, it arises from a systematization process of the scope already described in previous sections.

In complex interventions, where known and unknown factors interact and where it is difficult to foresee all the changes and resulting effects, largely because of their dynamism and frequent modifications to the initial protocol, it is necessary to identify clearly, in practice and in a real context, what the implementation process of the intervention was and whether its critical components were changed, what the advances and results are, and what really made it work, with whom, and under what circumstances (impact and effectiveness)26. However, despite this awareness, in practice the art of implementing programs is rarely analyzed and, thus, is not reported or published, so practical experience is not widely known22,25.

Based on the above, it can be said that information on key aspects of the implementation of interventions has not been collected frequently in the Latin American experiences and that, in the exceptional cases where it has been, the results are incomplete, disordered, and fragmented, without clear criteria on what to document or how to analyze and use the information produced.

It may be observed that, even though most of the interventions analyzed include one or more of these elements, they do not make explicit how the variables are interrelated or the operational models on which they are based; that is, following Chen27, they lack a prescriptive theory and, at times, their causal theory is also weak. It would then be safe to say that there is little awareness of the need for, and the importance of, documenting and systematizing the implementation process of the interventions and using this information to confront the theory and the underlying logical framework.

Regarding time and funding, which, as noted, are crucial variables for good implementation, the experiences discussed were programmed to last from one to three years and received funding mainly from governmental sources and from external sources such as NGOs and international cooperation agencies. It is the funding, rather than the logic of technical and operational feasibility for achieving the goals that drive an intervention, that tends to set its time horizon. It is noteworthy that no experience reported a duration of more than three years, and the median was 18 months.

The lack of interest in evaluating health initiatives and the absence of, or limited, support from management and the main users of the information are both cause and consequence of other problems: precarious, non-existent, or unsustained financing; insufficient time to conduct the evaluation and show results; and the limited technical capacity of the personnel responsible for these interventions. This becomes a vicious circle, resulting in poor-quality evaluation, little or no validity of the results, limited use, and insufficient information for formulating policies and programs and for reorienting interventions.

A common limitation in understanding and evaluating the implementation of the interventions is that, first, most of them (about 80%) had no clear and comprehensive conception of what the intervention was and, second, in 90% of instances the way it operated was largely unknown. Hence, there was no clear reference point for assessing the relevance and adequacy of the actions, or the sensitivity of the assumptions that supported the interventions. Part of the problem is that the interventions were designed as responses to policies and regulations issued by the management and supervisory levels of the health system, which must be complied with without clarity about the underlying issues. The changes carried out in some interventions were relatively limited in scope and in intent, which shows that the executors of these initiatives did what they could under the parameters by which they would be evaluated.

What do we ask and what do we know about evaluation? According to Mahoney et al.28, evaluating the impact of interventions to reduce health inequities should establish an articulation among the intervention, practice, health, and equity. In relation to this, high motivation for development and evaluation in health promotion and public health may be observed in Latin America. Despite this, the experiences expose weaknesses and gaps waiting to be resolved. The evaluation of these initiatives in the region is scarce and shows weaknesses in the relevance of the questions and study designs; consequently, it is not uncommon to find evaluation results with little validity and reliability. Similarly, in the cases studied, evaluation was found to respond more to academic interests than to a need felt by those responsible for making decisions, managing programs, and allocating resources. This explains in part why the results of many of these evaluations, even those with excellent designs, are often not taken into account for health decision making and do not go beyond academic circles29.

The results of this review also show that our opportunities to learn from and recognize one another within the framework of a shared practice of population health are inadequate. Latin America still lives with large gaps in the ownership of and access to knowledge, in connectivity and interconnectivity, and in the informational goods and services they produce. Reducing information asymmetries and deficiencies, and health inequities, is among the challenges our countries have yet to overcome.

As is known, fulfilling the objectives of a policy or program is influenced by the time and place in which the intervention operates; hence, the plausibility of an intervention achieving its objectives must be analyzed against these variables, along with the expected intermediate changes, to increase the likelihood of success.

The findings show that most of the evaluation questions in the cases analyzed refer to the success of the intervention; that is, to the accomplishment of objectives, the completion of performance goals, and the execution of programmed activities. This trend reflects the evaluators' interest in program performance, with little or no importance given to confronting the assumptions and hypotheses on which the interventions and their implementation are based; yet this is precisely the information needed to build theory and orient change processes, as well as to bridge the gap between theory and practice.

The methodological approaches of the evaluations. The approaches most frequently employed were positivist, represented by descriptive studies aimed at identifying changes in the behavior of specific population groups. Analytical studies designed to identify changes in the variables of interest at the end of the intervention were the second most frequent alternative.

In addition to determining whether there are causal associations between the intervention and the outcome, in other words, whether the results or changes achieved can be attributed to the intervention, evaluation becomes a learning process that contributes to the success of the intervention. In the latter case, evaluation is seen as an input for negotiating and building capacity to use the results and, as Smutylo30 rightly points out, for revealing unseen contributions: an effort to improve rather than to test, to understand rather than to assign blame, viewing evaluation as a generator of knowledge and not merely a seeker of merit.

In the same vein, the dilemma between the quantitative and the qualitative, far from being an empirical truth, has been a false one, a product of work conducted from different paradigms and schools, «sometimes with much resistance to establishing compatibilities and complementarities between the two trends»31. This reaffirms what was said earlier: no method can be identified as superior without analyzing it in light of the purpose and scope of the evaluation, the expected outcomes, the funding, the time available to perform the evaluation, and the context. The process of implementing public health and health promotion interventions is usually a «black box» whose central feature is the lack of information for judging the achievements of the intervention, the aspects that influenced implementation and the changes produced, adherence to the protocols, the degree to which the assumptions held, and how they might have affected the results. The evaluation must answer not only questions about what worked, but also for whom, how, and under what circumstances.

The evaluation process provides inputs to identify and understand the interaction of the variables at play in implementing the intervention; it establishes consistency between the theory underlying the intervention and practice; and, finally, it defines what the intervention meant in practice. This information in turn facilitates the reorientation and adjustment of the intervention's logical framework and contributes to achieving its objectives and impact. For this reason, authors such as Stake and Abma32 advocate approaches that give weight to the term contribution rather than attribution, which implies conditionality or contextualization. In this sense, Pawson33-35 recommends studying the whole system of relationships between the variables and suggests dividing the intervention into its components: mechanisms, context, and outcomes. Mechanisms refer to the ways in which a component produces change, defined as how individuals interpret and act upon the intervention strategy (the program mechanisms), while context refers to the place and to the system of interpersonal and social relations.

The results of the analysis show that imprecision in defining the problem and the situation to be changed contributed in many cases to no relationship being found among the evaluation question, the intent, the objectives of the study, and methodological approaches selected. During training, this exercise of reflection and questioning by the participants provided an excellent opportunity to identify gaps in intervention design, the formulation of the question, and the selection of the methodological approach to respond to it.

On the other hand, it can be said that the criteria and indicators for assessing the effectiveness of the interventions were too ambitious relative to what was implemented or planned. A direct consequence is that no changes were observed relative to the objectives, leading to the conclusion that the intervention did not work, when in fact what failed was the design of the intervention or its implementation.

The dilemma is whether we assess effectiveness only by the final results, or also use intermediate results, the preconditions leading to the attainment of the final results, as indicators. In several evaluation proposals, it was found that capacity building in individuals and specific populations, and likewise in institutions (in other words, in systems or practice settings), was theoretically connected to the components of the intervention. Nevertheless, this capacity building remains at the individual level and, in a few instances, at the level of specific groups linked to community and institutional structures. This means that the other components of the system were left intact when the intervention took place.

Constructivist approaches to questions about the behavior of variables and changes in initial conditions were applied in very few cases. Participants in the training courses recognized that the demands of managers and funding agencies are geared toward the impacts and outcomes of interventions, and rarely toward the implementation process and the factors influencing it. Hence, only two of the evaluation cases analyzed tried to fill this gap by applying qualitative techniques to document and systematize the interventions.

The methods and sources of information most frequently used were semi-structured surveys, institutional records, and census data. Only one experience used public health surveillance data, supplemented with information from other sources or existing information systems. The discussions around the topic showed that participants were uncertain about the scope of the different methods and their comparative advantages relative to the intent of the evaluation.

Uses of the information. Asking whether evaluations produce the information that decision makers and politicians need and want to receive, Crew and Young36 question the usefulness and relevance of evaluations and their consistency with the needs and interests of decision makers.

The findings from the cases under study indicate that the information resulting from evaluation is rarely used and that, when it is, it serves to account to the contracting institution for the work done, from the perspective of what that institution wants to know under the legal framework; in other words, reporting the number of activities or services provided relative to what was programmed. It is clear that a culture of evaluation does not exist, much less one of using the results to redirect programs or to elaborate the theory on which they are based. Two gaps have been identified around the connection between knowledge and action: the first between research and policy, and the second between knowledge and action itself.

However, new initiatives have emerged, such as processes for building partnerships and exchanges. Even so, it has been pointed out that, despite 30 years of research in this area, we still lack a robust and generalizable evidence base to inform decision makers about strategies for promoting the adoption of guidelines, protocols, or other measures on the use of evidence37.

Other aspects influencing the use of evidence in decision making and in practice relate to the lack of demand for such evidence, the high mobility of policy makers, governance processes, and dependency on donors. Hence, applying good global evidence requires triangulating it with local knowledge, which can be achieved through sustained processes of social participation38.

Likewise, advocacy has been considered an important strategy for helping to close the gap between knowledge and action. Klaudt, cited by WHO18, argues that the right knowledge often reaches people, but they are unable to turn it into action because of pressures and because of inertia in both society and institutions. While there is awareness of and motivation for advocacy, few researchers have sufficient capacity to carry it out, and they also show difficulty in working collaboratively with specialists on the subject.

A case study on the use of evidence in policymaking39 shows some strengths and limitations repeatedly pointed out by the participants themselves regarding the use of research in policy development. Among the strengths mentioned were the existence of an organizational approach to evidence-based policy formulation, which was nonetheless recognized as time-consuming, and a close relationship between researchers and policy makers, which could be affected by conflicts of interest between the two. The two main weaknesses mentioned were the lack of resources and the presence of conflicts of interest.

CONCLUSIONS

Regarding the evaluation of effectiveness in health promotion and public health in Latin America, we can say that we are doing what we can, not what we should; this is understandable and to be expected, but not acceptable, even in the presence of political systems, social structures, and legislation that contradict and conflict with the philosophy and ethical principles underlying the interventions. According to the findings of this review, Latin American public health governed by the principles and values of health promotion is in danger of becoming rhetoric, a healthy intention with little chance of success, if concrete and effective strategies and mechanisms are not created to influence the structural factors that impede or limit its implementation and results.

Despite the limitations noted, there is no doubt about the great potential and motivation in the region to work on evaluation research and practice, incorporating different levels, structures, and bodies of power, as well as key players such as the organized community, professional associations, educational institutions, and service providers in public and private institutions, among others.

It must be recognized that evaluation is perhaps the most suitable and useful tool for strengthening both the theory and the practice of health promotion and public health. It makes it possible to identify, explain, and assess the associations among inputs, impacts, and results, and it facilitates identification of the core components of the intervention and their interactions. For evaluation to fulfill this role, it must necessarily start from broad knowledge of the object of evaluation, in this instance the policies and programs. Understanding and appreciating the practice of health promotion and public health and its evaluation are not only important but necessary, given that our practice takes place in contexts different from those that gave rise to the theories and methodological approaches that have prevailed until now; we are therefore obliged to learn from them in a lasting way.

However, one lesson extracted from the experience and from other analyses and studies points out that public health and health promotion based on evidence may be viewed merely as matters of «fashion». Therefore, it is necessary to strengthen the knowledge, skills, and practical sense to produce, analyze and use information and evidence to characterize the problems, define strategies and programs to address them, and identify and assess the contextual conditions that increase the probabilities of success. Otherwise, we are at risk of losing the experience and wealth that practice offers us and therefore its contribution to the design, planning, and implementation of interventions from our own realities, needs, and demands.

In general, one can say that the analysis of the reported interventions exposes a great potential to act in favor of fulfilling the purposes and objectives that drive health promotion, as well as to reduce the broad gaps among countries and among participants and to fill gaps in individual competencies and in the countries' infrastructure with respect to their capacity to design, operate, and evaluate interventions.

The general situation presented and the insights that emerge from this analysis are tacitly intended to bring stakeholders to confront these findings about what they have done, in order to answer the question of whether they are doing what they should or only what they believe they can do given their own reality; to recognize the long road that remains to be traveled; and, especially, to motivate themselves to exploit their individual and institutional potential to position and strengthen public health from a policy and social perspective, i.e., from the perspective of health promotion.

REFERENCES

  1. Campostrini E. Measurement and effectiveness. Methodological considerations, issues and possible solutions. In: McQueen D, Jones C (eds.) Global perspectives on health promotion effectiveness. New York: Springer; 2007.
  2. De Salazar L, Anderson L. Health promotion evaluation in the Americas: divergent and common ground. In: Potvin L, McQueen D (eds.), Health promotion evaluation practices in the Americas. Values and research. New York: Springer; 2008.
  3. McQueen D, Anderson L. Evaluation in health promotion. Principles and perspectives. WHO Regional Publications European series, Nº 92; 2000.
  4. Potvin L, Haddad S, Frohlich K. Beyond process and outcome evaluation: a comprehensive approach for evaluating health promotion programmes. In: Rootman I, Goodstadt M, Hyndman B, McQueen DV, Potvin L, Springett J, Ziglio E (eds.). Evaluation in health promotion. Principles and perspectives. World Health Organization, WHO, Regional Publications, European Series, Nº 92; 2000.
  5. McQueen D. Evidence and theory. Continuing debates on evidence and effectiveness. In: McQueen D, Jones C (eds.) Global perspectives on health promotion effectiveness. IUHPE, New York: Springer; 2007.
  6. Brownson RC, Baker EA, Leet TL, Gillespie KN. Evidence-based public health. New York: Oxford University Press; 2003.
  7. Borja-Aburto VH. Estudios ecológicos. Salud Publica Mex. 2000; 42: 533-538.
  8. Hawe P, Shiell A, Riley T. Complex interventions: how «out of control» can a randomized controlled trial be? Br Med J. 2004; 328: 1561-3.
  9. Shiell A, Hawe P, Gold L. Complex interventions or complex systems? Implications for health economic evaluation. Br Med J. 2008; 336: 1281-3.
  10. Campbell M, Fitzpatrick R, Haines A, Kinmonth AL, Sandercock P, Spiegelhalter D, et al. Framework for design and evaluation of complex interventions to improve health. Br Med J. 2000; 321: 694–6.
  11. Stead M, Hasting G, Eadie D. The challenge of evaluating complex interventions: a framework for evaluating media advocacy. Health Edu Res. 2002; 17: 351-64.
  12. Hawe P, Degeling D, Hall J, Brierley A. Evaluating health promotion: A health worker guide. Sydney: Maclennan and Petty Ltd.; 2003.
  13. Brownson RC, Baker EA, Leet TL, Gillespie KN. Evidence-based public health. New York: Oxford University Press; 2003.
  14. Potvin L, McQueen D (eds.). Health promotion evaluation practices in the Americas. Values and research. New York: Springer; 2008.
  15. Campostrini E. Measurement and effectiveness. Methodological considerations, issues and possible solutions. In: McQueen D, Jones C (eds.) Global perspectives on health promotion effectiveness. New York: Springer; 2007.
  16. Clark NM, McLeroy KR. Reviewing the evidence for health promotion in the United States. In: Davies JK, MacDonald G (eds.). Quality, evidence and effectiveness in health promotion. London: Routledge; 1998. p. 21-46.
  17. Tang KC, Ehsani JP, McQueen DV. Evidence based public health policy and practice. Evidence-based health promotion: recollections, reflections, and reconsiderations. J Epidemiol Comm Health. 2003; 57: 841-3.
  18. World Health Organization (WHO). Bridging the «know-do» gap. Meeting on knowledge translation in global health. Geneva: World Health Organization; 2006.
  19. Pellegrini A. Ciencia en pro de la salud. Notas sobre la organización de la actividad científica para el desarrollo de la salud en América Latina y el Caribe. Washington, D.C.: Organización Panamericana de la Salud; 2000.
  20. Organización Panamericana de la Salud (OPS). Identificación de ofertas de formación y sondeo de competencias para la promoción de la salud en América Latina. Reporte, OPS, Unidad de Entornos y Comunidades Saludables, con el apoyo de FUNDESALUD, Bogotá; 2008.
  21. Bambra C, Fox D, Scott-Samuel A. Towards a politics of health. Health Promo Internat. 2005; 20: 187-93.
  22. Durlak JA. Why program implementation is important. J Prev Intervent Comm. 1998; 17: 5-18.
  23. Dane A, Schneider B. Program integrity in primary and early secondary prevention: Are implementation effects out of control? Clin Psychol Review. 1998; 18: 23-45.
  24. Domitrovich C, Greenberg M. The study of implementation: Current findings from effective programs that prevent mental disorders in school-aged children». J Edu Psychol Consult. 2000; 11: 193-221.
  25. Mihalic S, Fagan A, Irwin K, Ballard D, Elliott D. Blueprints for violence prevention replications: Factors for implementation success. Boulder: Center for the Study of Prevention of Violence, Institute of Behavioral Science, University of Colorado; 2002.
  26. Barry M. Researching the implementation of community mental health promotion programmes. Health Prom J Australia. 2007; 18: 240-6.
  27. Chen HT. Theory-driven evaluations. In Walberg H (ed.). Advances in educational productivity. Greenwich: JAE Press; 1998. p. 15-34.
  28. Mahoney M, Simpson S, Harris E, Aldrich R, Stewart WJ. Equity focused health impact assessment framework. The Australasian Collaboration for Health Equity Impact Assessment (ACHEIA); 2004.
  29. Salazar L de. Promoción de la salud y evidencias de su efectividad: Un reto metodológico y político. En: Arroyo H (ed.). Promoción de la salud en América Latina: modelos, estructuras y visión crítica. San Juan: Consorcio Interamericano de Universidades y Centros de Formación en Promoción de la Salud (UIPES); 2004.
  30. Smutylo T. Impacto latente, atribución oculta: cómo superar las amenazas al aprendizaje en los programas de desarrollo. Ottawa: Unidad de Evaluación, Centro Internacional de Investigaciones para el Desarrollo (CIID-IDRC); 2001.
  31. Cerda GH. La investigación total. La unidad metodológica en la investigación científica. Mesa Redonda Magisterio. Bogotá; 1994.
  32. Stake RE, Abma TA. Responsive evaluation. In: Mathison S (ed.). Encyclopedia of evaluation. Thousand Oaks: Sage; 2005. p. 376-9.
  33. Pawson R. Evidence and policy and naming and shaming. ESRC UK Centre for Evidence-Based Policy and Practice; 2001.
  34. Pawson R. Evidence-based policy: the promise of realist synthesis. Evaluation. 2002; 8: 340-58.
  35. Pawson R. Nothing as practical as a good theory. Evaluation. 2003; 9: 471-90.
  36. Crew E, Young J. Bridging research and policy: context, evidence and links. Working Paper. London: Overseas Development Institute; 2002.
  37. Jong-Wook L. World report on knowledge for better health. In: Research policy. World Health Organization, [online]. 2004 (Consulted 3 September 2009). URL available in: http://www.who.int/rpc/meetings/wr2004/en/index.html
  38. Grimshaw JM, Thomas RE, Mac Lennan G, Fraser C, Ramsay CR, Vale R. et al. Effectiveness and efficiency of guideline dissemination and implementation strategies. Health Technology Assessment; 2004; 8: 1-72.
  39. Lavis J, Moynihan R, Oxman A, Paulsen E. Evidence-informed health policy 4. Case descriptions of organizations that support the use of research evidence. In: Implementation science, vol. 3, Nº 56, [online]. 2008 (Consulted 3 September 2009). URL available in: http://www.implementationscience.com/content/3/1/56

© Copyright 2010 - Revista Colombia Médica
