Advancing impact evaluation in applied limnology

Abstract

Accurate impact evaluations of different interventions are paramount in the environmental sciences. In this context, the main challenge is to identify causal relationships in order to understand how different interventions affect the systems of interest. For this task, counterfactual thinking can be used to estimate the impacts of interventions at the real scale of the problem using observational data. By definition, counterfactuals are states contrary to facts; in the context of interventions, they are the states of the units of analysis in the absence of the intervention. This approach allows the impact to be estimated more accurately by comparing factual and counterfactual states. In this essay, we present some basic elements of counterfactual thinking and, drawing on experience from other areas (e.g., medicine and economics), discuss how it may be useful for researching complex problems in aquatic ecology.

Keywords:
applied limnology; causality; counterfactual; impact evaluation; intervention


Resumo

Avaliações de impacto acuradas de diferentes intervenções são necessidades na área ambiental. Nesse contexto, o principal desafio é identificar relações causais com o objetivo de compreender como diferentes intervenções afetam os sistemas de interesse. Para tal, a lógica contrafactual pode ser usada para estimar impactos de intervenções na escala real do problema utilizando dados observacionais. Contrafactuais são estados contrários aos fatos. No contexto de intervenções, seriam os estados das unidades intervindas na ausência da intervenção. Essa abordagem permite estimar o impacto de maneira mais acurada ao comparar as diferenças entre os estados factuais e contrafactuais. Nesse ensaio, apresentamos alguns elementos básicos da lógica contrafactual e discutimos como essa lógica, considerando o que ocorre em outras áreas (e.g., medicina e economia), pode ser útil para a pesquisa de problemas complexos em ecologia aquática.

Palavras-chave:
avaliação de impacto; causalidade; contrafactual; limnologia aplicada; intervenção


1. Introduction

Consider a conservation program aimed at restoring riparian forests and improving water quality in a watershed. In this scenario, an impact evaluation would be essential to assess whether the intervention achieved its goals (Frondel & Schmidt, 2005; Ferraro & Pattanayak, 2006; Ferraro, 2009), and its results could justify (or not) applying the same intervention in other watersheds. However, even though most ecologists would agree on the need for sound programs to evaluate the impact of interventions, such programs are rare in applied ecology (Ferraro & Pattanayak, 2006), and especially so in applied limnology. Indeed, several papers have emphasized the need for more rigorous evaluations of the effectiveness of conservation interventions (e.g., Kleiman et al., 2000; Pullin & Knight, 2001; Salafsky et al., 2002; Salafsky & Margoluis, 2003), and Ferraro & Pattanayak (2006) highlighted that the effectiveness of interventions is frequently assumed without convincing evidence.

However, designing good evaluation programs is far from trivial. To rigorously evaluate the effect of an intervention, it is necessary to consider how it could change causal relationships, which in turn would change the outcomes of interest (Pearl & Mackenzie, 2018). Establishing a causal relationship requires that the factors involved interact in such a way that a change in the state of one factor (the cause) results in a change in the state of another factor (the effect; Pearl, 2009). More specifically, a causal relationship may be one in which X causes Y, such that Y would not occur if X did not occur (i.e., attribution), or one in which X contributes, together with other factors, to changing the state of Y (i.e., contribution; White, 2010). Such relationships cannot be studied simply by fitting models that relate ‘predictive’ and ‘response’ variables. Therefore, to study causal relationships, such as that between an intervention and its potential outcomes in an impact evaluation, it is necessary to manipulate the cause variables in the system of interest and estimate how this manipulation propagates through the causal structure and changes the distributions of the effect variables (Holland, 1986; Pearl & Mackenzie, 2018).
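
The difference between observing an association and intervening on a variable can be summarized schematically in the notation popularized by Pearl (2009) and Pearl & Mackenzie (2018):

$$P(Y \mid X = x) \quad \text{versus} \quad P(Y \mid \mathrm{do}(X = x)),$$

where the first quantity describes the outcome among units that happen to exhibit X = x (what a model relating ‘predictive’ and ‘response’ variables estimates), and the second describes the outcome if X were set to x by an intervention. When other factors influence both X and Y, the two quantities generally differ, which is why impact evaluation requires the interventional quantity rather than the purely observational one.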

2. Traditional Impact Evaluations in Environmental Sciences

Disregarding causal relationships when evaluating the impact of interventions is still common in the environmental sciences (Ferraro & Pattanayak, 2006). Typically, the effectiveness of interventions is assessed by monitoring a set of indicator variables (see Box 1). According to Ferraro (2009): “Environmental scientists and practitioners often assume that evaluation is simply an act of taking a careful look at the monitoring data. If the indicator improves, a program is deemed to be working. If the indicator worsens, one infers the program is failing.” Although monitoring is important for understanding and tracking how interventions are unfolding, this approach has little to do with impact evaluation (Perrin, 2012).

Table 1. Different approaches to analyze the impact of a hypothetical restoration program in watersheds.

Another common strategy in the environmental sciences is to compare intervened units with different non-intervened units, for example in a Before-After-Control-Impact (BACI) framework (Smith, 2002; Chevalier et al., 2018). However, these units are not necessarily similar with respect to features that affect both the likelihood of receiving the intervention and the causal relationships that drive changes in the outcome variables (see Box 1). By disregarding these differences, a study may under- or overestimate the impacts of interventions (as shown by Andam et al., 2008). For example, suppose that a protected area is created on steep terrain with low agricultural potential, and that its deforestation rates are compared with those of unprotected areas located in flat regions with high agricultural potential. The protected area’s impact in avoiding deforestation would probably be overestimated, because deforestation rates in the protected area would be low relative to the unprotected areas even without legal protection. Moreover, in this hypothetical example, we could not state that the intervention (i.e., the creation of the protected area) caused the reduction in deforestation rates, as the area was already prone to low deforestation due to its low economic potential for human use.
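
To make the direction of this bias concrete, the sketch below simulates the hypothetical protected-area example. All numbers (baseline deforestation risks, the assumed true effect of protection, assignment probabilities) are illustrative assumptions, not real data; the point is only that the naive protected-versus-unprotected contrast overstates the effect, whereas comparing areas of the same terrain type recovers something close to the assumed effect.

    # Hypothetical simulation: protection is assigned mostly to steep terrain,
    # which already has a low deforestation risk, so a naive comparison with
    # unprotected (mostly flat) areas overstates the effect of protection.
    import numpy as np

    rng = np.random.default_rng(42)
    n = 10000

    steep = rng.random(n) < 0.5                        # terrain type (confounder)
    baseline = np.where(steep, 0.05, 0.30)             # deforestation risk without protection
    protected = rng.random(n) < np.where(steep, 0.8, 0.2)

    true_effect = -0.05                                # assumed causal effect of protection
    risk = np.clip(baseline + protected * true_effect, 0.0, 1.0)
    deforested = rng.random(n) < risk

    # Naive contrast: protected vs. unprotected, ignoring terrain.
    naive = deforested[protected].mean() - deforested[~protected].mean()

    # Contrast within each terrain type (a crude counterfactual adjustment).
    adjusted = np.mean([
        deforested[protected & g].mean() - deforested[~protected & g].mean()
        for g in (steep, ~steep)
    ])

    print(f"naive estimate:    {naive:+.3f}")    # roughly -0.20: overstated
    print(f"adjusted estimate: {adjusted:+.3f}") # close to the assumed -0.05

The naive estimate attributes to protection a reduction that is mostly due to terrain, which is exactly the overestimation described above.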

3. Counterfactual Alternatives

Experimentation is certainly one of the most powerful approaches to establish causal relationships. In this approach, randomization ensures the independence between observations, so the impact of an intervention can be accurately estimated by comparing experimental (or “treated”) units with control units (Raper, 2019). However, conducting controlled experiments is often infeasible for several reasons, such as resource limitations, ethical issues, and practical and logistical difficulties (Underwood, 1992; Lan & Yin, 2017). Furthermore, extrapolating the results of a “classical” (small-scale) experiment to the scale of interest may lead to flawed conclusions, given that processes may differ markedly between experimental and real scenarios (Cook et al., 2008). These limitations are the main reason for using observational data to estimate the impact of interventions.

Counterfactual thinking can be used to overcome the problems associated with “classical” experimentation (Ferraro & Pattanayak, 2006; Pearl & Mackenzie, 2018) by comparing factual states with states that contradict the facts (i.e., counterfactual states; Pearl & Mackenzie, 2018). In the example of the effect of riparian restoration on the water quality of streams (see Box 1), we would have to evaluate the state of those same streams in a situation where the restoration had not been implemented. Here we face the fundamental problem of causal inference: a given unit (a stream, in our example; see Box 1) cannot be observed both with and without the intervention at the same time (Holland, 1986). Hence, counterfactual states are never observed and must be estimated. The estimated counterfactual states are then compared with the factual states to estimate the impact of the intervention (e.g., Andam et al., 2008; McConnachie et al., 2015; Sonter et al., 2017). But how can counterfactual thinking be applied to evaluate the impact of interventions? One possibility is to estimate counterfactual states that mimic the conditions of a controlled experiment, but using observational data (e.g., Andam et al., 2008; McConnachie et al., 2015; Sonter et al., 2017).
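
In the potential-outcomes notation used by Holland (1986) and Imbens & Rubin (2015), this reasoning can be written schematically (the symbols below are introduced only for illustration):

$$\tau_i = Y_i(1) - Y_i(0),$$

where $Y_i(1)$ is the outcome of unit $i$ (e.g., a stream) with the intervention and $Y_i(0)$ its outcome without it; only one of the two is ever observed for a given unit, which is exactly the fundamental problem of causal inference. Impact evaluations therefore target an average quantity such as the effect on the intervened (treated) units,

$$\mathrm{ATT} = E[\,Y(1) - Y(0) \mid T = 1\,] = E[\,Y(1) \mid T = 1\,] - E[\,Y(0) \mid T = 1\,],$$

in which $T = 1$ indicates intervened units and the last term, $E[\,Y(0) \mid T = 1\,]$, is the counterfactual component that must be estimated from comparable non-intervened units.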

Understanding how different features influence the causal relationship between the intervention and the outcome of interest is crucial for estimating counterfactual states (Imbens & Rubin, 2015). In our hypothetical example, we could envisage the following counterfactual question: “what would water quality be without the intervention?” We know that water quality is influenced by nutrient inputs from terrestrial ecosystems, and this kind of knowledge should be taken into account when estimating adequate counterfactual states. After identifying the important features (e.g., limnological characteristics, nutrient inputs and soil type), different methods (e.g., matching; see Stuart, 2010) can be applied to select control units that are comparable, with respect to these features, to the intervened units. In this way, more accurate counterfactual states can be estimated, compared with the factual states (i.e., the units under intervention) and, finally, used to estimate the impact of the intervention with the required rigor (see Box 1). Many methods can be used to estimate counterfactual states; a detailed description of them is beyond the scope of this paper (interested readers should consult Stuart (2010), Imbens & Rubin (2015) and Pearl & Mackenzie (2018) for practical and theoretical introductions to the topic).
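
As a rough sketch of the matching idea (one simple nearest-neighbour variant among the many methods reviewed by Stuart, 2010), the code below pairs each restored stream with the most similar unrestored stream on standardized covariates and contrasts the factual outcomes with those of the matched controls. The covariates, their values and the water-quality index are all invented for illustration.

    # Nearest-neighbour covariate matching: a minimal, illustrative sketch.
    import numpy as np

    def match_controls(X_treated, X_control):
        """For each treated unit, return the index of the most similar control
        unit (Euclidean distance on standardized covariates)."""
        X_all = np.vstack([X_treated, X_control])
        mean, std = X_all.mean(axis=0), X_all.std(axis=0)
        Zt = (X_treated - mean) / std
        Zc = (X_control - mean) / std
        dist = np.linalg.norm(Zt[:, None, :] - Zc[None, :, :], axis=2)
        return dist.argmin(axis=1)

    # Hypothetical covariates: [nutrient input (mg/L), % agriculture in catchment].
    X_restored = np.array([[1.2, 40.0], [0.8, 55.0]])                  # streams with riparian restoration
    X_unrestored = np.array([[1.1, 42.0], [2.5, 90.0],
                             [0.9, 50.0], [3.0, 85.0]])                # candidate control streams

    y_restored = np.array([7.5, 6.8])                                  # observed (factual) water-quality index
    y_unrestored = np.array([6.9, 4.0, 6.1, 3.8])

    matched = match_controls(X_restored, X_unrestored)
    # The matched controls stand in for the counterfactual ("no restoration")
    # state of the restored streams, so the mean difference estimates the impact.
    impact = np.mean(y_restored - y_unrestored[matched])
    print(f"matched control indices: {matched}, estimated impact: {impact:.2f}")

In real applications, the choice of covariates, the distance metric and the matching algorithm (with or without replacement, calipers, propensity scores, and so on) all require careful justification, as discussed in the references above.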

Counterfactual thinking has been applied in the environmental sciences to estimate, for example, the effects of legislation focused on endangered species (Ferraro et al., 2007), the effectiveness of protected areas (Andam et al., 2008), programs to control invasive species (McConnachie et al., 2015) and payments for ecosystem services (Pattanayak et al., 2010). Nonetheless, similar studies in other ecosystems and aimed at evaluating other interventions are still rare. For example, as far as we know, no study based on counterfactual thinking has estimated the effect of interventions on continental aquatic ecosystems. We believe that limnology and related fields would greatly benefit from adopting counterfactual thinking. Furthermore, estimating the impact of interventions in continental waters is urgent if we want to influence public policies and future conservation actions. In summary, we need to multiply examples such as the classical study of Schindler & Fee (1974), which was essential for the formulation of laws restricting phosphorus content in effluents.

  • Cite as: Ribas, L. G. S., Padial, A. A. and Bini, L. M. Advancing impact evaluation in applied limnology. Acta Limnologica Brasiliensia, 2019, vol. 31, e101.

References

  • ANDAM, K.S., FERRARO, P.J., PFAFF, A., SANCHEZ-AZOFEIFA, G.A. and ROBALINO, J.A. Measuring the effectiveness of protected area networks in reducing deforestation. Proceedings of the National Academy of Sciences of the United States of America, 2008, 105(42), 16089-16094. http://dx.doi.org/10.1073/pnas.0800437105 PMid:18854414.
    » http://dx.doi.org/10.1073/pnas.0800437105
  • CHEVALIER, M., RUSSEL, J.C. and KNAPE, J. New measures for evaluation of environmental perturbations using BACI analyses. Ecological Applications, 2018, 29(2), 1-12.
  • COOK, T.D., SHADISH, W.R. and WONG, V.C. Three conditions under which experiments and observational studies produce comparable causal estimates: New findings from within‐study comparisons. Journal of Policy Analysis and Management, 2008, 27(4), 724-750. http://dx.doi.org/10.1002/pam.20375
    » http://dx.doi.org/10.1002/pam.20375
  • FERRARO, P.J. Counterfactual thinking and impact evaluation in environmental policy. New Directions for Evaluation, 2009, 122(122), 75-84. http://dx.doi.org/10.1002/ev.297
    » http://dx.doi.org/10.1002/ev.297
  • FERRARO, P.J., MCINTOSH, C. and OSPINA, M. The effectiveness of the US endangered species act: An econometric analysis using matching methods. Journal of Environmental Economics and Management, 2007, 54(3), 245-261. http://dx.doi.org/10.1016/j.jeem.2007.01.002
    » http://dx.doi.org/10.1016/j.jeem.2007.01.002
  • FERRARO, P.J. and PATTANAYAK, S.K. Money for nothing? A call for empirical evaluation of biodiversity conservation investments. PLoS Biology, 2006, 4(4), e105. http://dx.doi.org/10.1371/journal.pbio.0040105 PMid:16602825.
    » http://dx.doi.org/10.1371/journal.pbio.0040105
  • FRONDEL, M. and SCHMIDT, C.M. Evaluating environmental programs: The perspective of modern evaluation research. Ecological Economics, 2005, 55(4), 515-526. http://dx.doi.org/10.1016/j.ecolecon.2004.12.013
    » http://dx.doi.org/10.1016/j.ecolecon.2004.12.013
  • HOLLAND, P.W. Statistics and causal inference. Journal of the American Statistical Association, 1986, 81(396), 945-960. http://dx.doi.org/10.1080/01621459.1986.10478354
    » http://dx.doi.org/10.1080/01621459.1986.10478354
  • IMBENS, G.W. and RUBIN, D.B. Causal inference for statistics, social, and biomedical sciences. New York: Cambridge University Press, 2015. http://dx.doi.org/10.1017/CBO9781139025751
    » http://dx.doi.org/10.1017/CBO9781139025751
  • KLEIMAN, D.G., READING, R.P., MILLER, B.J., CLARK, T.W., SCOTT, J.M., ROBINSON, J., WALLACE, R.L., CABIN, R.J. and FELLEMAN, F. Improving the evaluation of conservation programs. Conservation Biology, 2000, 14(2), 356-365. http://dx.doi.org/10.1046/j.1523-1739.2000.98553.x
    » http://dx.doi.org/10.1046/j.1523-1739.2000.98553.x
  • LAN, J. and YIN, R. Research trends: Policy impact evaluation: Future contributions from economics. Forest Policy and Economics, 2017, 83, 142-145. http://dx.doi.org/10.1016/j.forpol.2017.07.009
    » http://dx.doi.org/10.1016/j.forpol.2017.07.009
  • MCCONNACHIE, M.M., VAN WILGEN, B.W., FERRARO, P.J., FORSYTH, A.T., RICHARDSON, D.M., GAERTNER, M. and COWLING, R.M. Using counterfactuals to evaluate the cost-effectiveness of controlling biological invasions. Ecological Applications, 2015, 26(2), 475-483. http://dx.doi.org/10.1890/15-0351 PMid:27209789.
    » http://dx.doi.org/10.1890/15-0351
  • PATTANAYAK, S.K., WUNDER, S. and FERRARO, P.J. Show me the money: do payments supply environmental services in developing countries? Review of Environmental Economics and Policy, 2010, 4(2), 254-274. http://dx.doi.org/10.1093/reep/req006
    » http://dx.doi.org/10.1093/reep/req006
  • PEARL, J. Causal inference in statistics: An overview. Statistics Surveys, 2009, 3(0), 96-146. http://dx.doi.org/10.1214/09-SS057
    » http://dx.doi.org/10.1214/09-SS057
  • PEARL, J. and MACKENZIE, D. The Book of Why: The new science of cause and effect. New York: Basic Books, 2018.
  • PERRIN, B. Linking monitoring and evaluation to impact evaluation. Impact Evaluation Notes, 2012, 2.
  • PULLIN, A.S. and KNIGHT, T.M. Effectiveness in conservation practice: pointers from medicine and public health. Conservation Biology, 2001, 15(1), 50-54. http://dx.doi.org/10.1111/j.1523-1739.2001.99499.x
    » http://dx.doi.org/10.1111/j.1523-1739.2001.99499.x
  • RAPER, S. Turning points: Fisher’s random idea. Significance, 2019, 16(1), 20-23. http://dx.doi.org/10.1111/j.1740-9713.2019.01230.x
    » http://dx.doi.org/10.1111/j.1740-9713.2019.01230.x
  • SALAFSKY, N. and MARGOLUIS, R. What conservation can learn from other fields about monitoring and evaluation. Bioscience, 2003, 53(2), 120-122. http://dx.doi.org/10.1641/0006-3568(2003)053[0120:WCCLFO]2.0.CO;2
    » http://dx.doi.org/10.1641/0006-3568(2003)053[0120:WCCLFO]2.0.CO;2
  • SALAFSKY, N., MARGOLUIS, R., REDFORD, K.H. and ROBINSON, J.G. Improving the practice of conservation: a conceptual framework and research agenda for conservation science. Conservation Biology, 2002, 16(6), 1469-1479. http://dx.doi.org/10.1046/j.1523-1739.2002.01232.x
    » http://dx.doi.org/10.1046/j.1523-1739.2002.01232.x
  • SCHINDLER, D.W. and FEE, E.J. Experimental lakes area: Whole‐lake experiments in eutrophication. Journal of the Fisheries Research Board of Canada, 1974, 31(5), 937-953. http://dx.doi.org/10.1139/f74-110
    » http://dx.doi.org/10.1139/f74-110
  • SMITH, E.P. BACI Design. In: A. H. EL-SHAARAWI and W. W. PIEGORSCH. Encyclopedia of environmetrics. Chichester: John Wiley & Sons, 2002, pp. 141-148.
  • SONTER, L.J., HERRERA, D., BARRETT, D.J., GALFORD, G.L., MORAN, C.J. and SOARES-FILHO, B.S. Mining drives extensive deforestation in the Brazilian Amazon. Nature Communications, 2017, 8(1), 1013. http://dx.doi.org/10.1038/s41467-017-00557-w PMid:29044104.
    » http://dx.doi.org/10.1038/s41467-017-00557-w
  • STUART, E.A. Matching methods for causal inference: A review and a look forward. Statistical Science: A Review Journal of the Institute of Mathematical Statistics, 2010, 25(1), 1-21.
  • UNDERWOOD, A.J. Beyond BACI: The detection of environmental impacts on populations in the real, but variable, world. Journal of Experimental Marine Biology and Ecology, 1992, 161(2), 145-178. http://dx.doi.org/10.1016/0022-0981(92)90094-Q
    » http://dx.doi.org/10.1016/0022-0981(92)90094-Q
  • WHITE, H.A. Contribution to current debates in impact evaluation. Evaluation, 2010, 16(2), 153-164. http://dx.doi.org/10.1177/1356389010361562
    » http://dx.doi.org/10.1177/1356389010361562

Publication Dates

  • Publication in this collection
    13 June 2019
  • Date of issue
    2019

History

  • Received
    20 Feb 2019
  • Accepted
    25 Apr 2019