Ecological Relevance With Mechanistic Effect Models

As a way to control and limit the impacts of human activities on the environment, such as the use of chemical products like pesticides, an increasing number of regulations have been enforced in the EU. Especially where the production and use of agrochemicals are concerned, human and environmental risk assessment (ERA) procedures are the primary instruments for meeting the requirements of chemical control laws. Still, the ability of traditional ERA procedures to predict actual consequences for biological communities is poor, since they generally do not consider the complex compensatory mechanisms and the interactions between biotic and abiotic components that govern natural ecosystems. Moreover, non-target organisms are exposed not to a single harmful chemical, but often to mixtures of toxicants and to multiple stressors, both anthropogenic and natural.

In this thesis I provide evidence in support of the inclusion of ecological effect models in the risk assessment of chemicals. For this purpose I developed population models, a modelling approach that has been recognized as a good tool for supporting environmental decision-making. The thesis therefore contributes to the application of mechanistic effect models in environmental risk assessment, and the findings could be useful for developing more ecologically relevant ERA procedures. The thesis focuses on the collembolan Folsomia candida, a species routinely used in standard toxicity tests (Organisation for Economic Co-operation and Development, 2009), and concentrates on environmental risk assessment for terrestrial ecosystems.

In all the experimental work conducted and reported in this thesis, copper was used as the model toxicant. It is widely used as a pesticide and, for this reason, is the most widely distributed metal pollutant. It is thus not only representative of metals, an important group of pollutants, but also highly relevant to the ecological risk assessment of pesticides under Regulation EC 1107/2009.

6.1 Terrestrial risk assessment: adding ecological relevance with mechanistic effect models

From the description given in Chapter 1, it is apparent that the risk assessment scheme for soil invertebrates is rather minimal. Invertebrates, however, play significant, but largely ignored, roles in the delivery of ecosystem services by soils at plot and landscape scales (Lavelle et al., 2006). Invertebrate abundance and species richness are used in different soil quality indices. They participate actively in the interactions that develop in soil among physical, chemical and biological processes, and their presence alters the rate and extent of these processes, as well as the physical and chemical structure of the soil (Lavelle et al., 2006). Collembola contribute functionally to the terrestrial food web at several trophic levels, and soil-dwelling springtails decompose plant residues. Soil invertebrates are therefore key drivers of important ecosystem services such as nutrient cycling and soil formation (Lavelle et al., 2006). In this light, it can be argued that risk assessment schemes for this group of organisms should be more thorough. Nevertheless, it has often been recognized that assessing the effects and risks of chemicals for the terrestrial environment is a complex matter. As reported by Tarazona et al. (2002), ecotoxicologists have in the past, for various reasons, focused on aquatic systems, so terrestrial risk assessments have been forced simply to apply the aquatic model to soils, or have focused on specific targets such as the risk posed by agrochemical pesticides to birds, bees and beneficial arthropods. This has generated inconsistencies in estimates of risk for different uses of the same active substance, and it is not clear whether these originate from uncertainty, cost/benefit considerations, or the lack of scientific knowledge when the guidelines were set (Tarazona et al., 2002).

Ecological effect models can be an important instrument for adding ecological relevance to terrestrial risk assessment. These models are primarily envisioned for use in current ERA schemes as an alternative refinement option at higher tiers. Population modelling has in fact already been included, among the options to be chosen on a case-by-case basis for higher-tier refinement, in the new guidance document on risk assessment for birds and mammals (European Food Safety Authority, 2009). This use of ecological models can be particularly helpful for terrestrial ecosystems, as the current refinement options, primarily ex situ bioassays, Terrestrial Model Ecosystems (TME) and field surveys, have obvious limitations. The main drawbacks they share are the difficulty of linking the observed effects to a specific toxic component in the soil and of finding a proper reference site or soil (Jensen and Pedersen, 2006). Both these issues could be solved using ecological effect models, which mechanistically link effects at the individual or sub-individual level to the population or community level, and easily allow comparison of different contaminated scenarios with a reference or control situation. In particular, a major strength of population models is the possibility to start simple and progressively include further complexity if the need arises. In a regulatory context, this means that a simpler model, simulating constant temperature and optimal food conditions, can be used as a first step to answer more basic questions. Only where it is necessary to refine the estimate of risk towards more realistic conditions, such as variable climatic regimes or more complex exposure scenarios, need further parameters or submodels be implemented. Unlike field surveys, this strategy allows the results to be interpreted by distinguishing between the effects of the different factors.
To apply it, either two types of models have to be developed, as shown in Chapter 5, or a flexible modelling approach, which allows additional submodels to be added easily, needs to be used. An example of the latter approach is given in Chapter 4, where three extra submodels were integrated into the IBM (see Section 4.2) so that the model could test different hypotheses. First, simulations with constant temperature and no disturbances were run (Figs. 4.3 and 4.4), then only a natural disturbance was added (Fig. 4.5), and finally simulations with different patterns of both natural and anthropogenic disturbance factors were run (Fig. 4.6). In this way it was possible to better understand the effects of the different factors, such as avoidance behaviour, habitat fragmentation and disturbances, or temperature.

As these results show, spatial heterogeneity in exposure is an important ecological complexity, but one that is mostly overlooked in environmental risk assessment, especially for terrestrial ecosystems. It is well known that in soil both natural properties, such as moisture and organic matter concentrations, and abiotic factors, such as chemical contamination, are heterogeneously distributed (Lavelle and Spain, 2001; Becker et al., 2006), and that this influences the distribution and functioning of soil populations (Hoy and Hall, 1998). From the results of the studies presented in Chapters 3, 4 and 5 it is apparent that a more realistic exposure assessment can significantly influence estimates of risk for soil organisms. Exposure in soil is currently defined by a PEC which is assumed to be homogeneous; however, model results show that assuming homogeneous concentrations of a toxicant in soil might lead to overestimation of risk for collembolan populations if the actual application method is unlikely to cause homogeneous contamination. For instance, the comparison of homogeneous and heterogeneous scenarios in Fig. 3.6 shows that, although reduced in size relative to the control, a population can still survive even at very high concentrations. The same result is obtained with a metapopulation modelling approach (Fig. 5.6): homogeneous exposure is more harmful and represents a worst-case scenario.

The influence of spatial heterogeneity of exposure in soil is confirmed by the findings of a number of empirical studies on populations and communities of soil invertebrates. In microcosm experiments, Salminen (1996) showed that soil animals colonizing a defaunated and patchily polluted soil area concentrated in the sites with the lowest pollution level. Results of a field study (Salminen, 1999) suggest that enchytraeids may have population dynamics connected to patches (sources and sinks) caused by the uneven distribution of metals, and that this can mitigate the effects of metals on their population densities. Results of yet another field survey (Gongalsky, 2009) indicate that the patchiness of soil pollution may act as a leading factor in the distribution of belowground soil invertebrates.

Furthermore, as a way of improving the link between exposure and effects, and thus obtaining more precise estimates of risk, avoidance behaviour should be included as a standard endpoint, considering the great influence that its inclusion or exclusion had on model results. For instance, Fig. 5.5 shows that, as concentration and percentage of contaminated area increase, so does the difference between the outputs of simulations with and without avoidance. This suggests that ignoring whether the chemical of concern is avoided or not may lead to over- or underestimation of risk. Furthermore, Fig. 4.4 shows that variability among replicates is much higher when the contaminant is not avoided: data on avoidance behaviour could therefore be relevant to better inform on the precision of risk estimates. The implementation of avoidance behaviour tests as screening tools in ERA has already been supported by several studies (e.g. da Luz et al., 2004; Loureiro and Nogueira, 2005). The use of avoidance behaviour of soil invertebrates as an indicator of unfavourable conditions allows a preliminary assessment of contaminated soils in a short period of time, with a high degree of sensitivity (Aldaya et al., 2006). Moreover, from an ecological point of view, avoidance is a relevant endpoint, and avoidance tests are more sensitive to within-species population differences (Aldaya et al., 2006). A combination of both types of tests could provide more detailed information on the impact of pesticides and other harmful substances on Collembola (Heupel, 2002).
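To make the influence of avoidance concrete, the following toy patch model contrasts populations that do and do not avoid contaminated habitat. It is a deliberately minimal sketch with invented parameter values (patch mortality risks, movement rules), not a reimplementation of the IBM from Chapters 3-5:

```python
import random

def simulate(grid, n_ind=200, steps=50, avoid=False, seed=1):
    """Toy 1-D ring of habitat patches: each step, every individual moves to
    an adjacent patch (or stays) and then dies with probability equal to the
    contamination level of the patch it occupies. All values are invented
    for illustration."""
    rng = random.Random(seed)
    pos = [rng.randrange(len(grid)) for _ in range(n_ind)]
    for _ in range(steps):
        survivors = []
        for p in pos:
            left, right = (p - 1) % len(grid), (p + 1) % len(grid)
            if avoid:
                # avoiders move towards the least contaminated adjacent patch
                p = min((left, p, right), key=lambda i: grid[i])
            else:
                # non-avoiders move at random, ignoring contamination
                p = rng.choice((left, p, right))
            if rng.random() > grid[p]:  # grid value = per-step mortality risk
                survivors.append(p)
        pos = survivors
    return len(pos)

# half the ring clean, half contaminated (50% mortality risk per step)
patchy = [0.0] * 10 + [0.5] * 10
```

In this sketch, avoiders accumulate in the clean patches and persist, while non-avoiders repeatedly wander into contaminated patches and suffer heavy mortality; qualitatively, this is the kind of divergence between the two behavioural assumptions discussed above.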

Population models clearly cannot substitute for lower-tier short-term laboratory tests on individuals, which are necessary as a first screening tool and also to produce the effects data needed to parameterize the models. In fact, in order to obtain realistic estimates of the effects of a model contaminant at the population level, it is necessary to implement mechanistically meaningful toxicity data (e.g. concentration-response equations) at the individual or sub-individual level. An example of the importance of using accurate toxicity data in a mechanistic effect model is provided by the simulation results produced by the IBM, and especially by the tests conducted within the pattern-oriented framework to verify the model (Chapter 3, Fig. 3.5). In this regard, in order to harmonize the use of laboratory tests and mechanistic effect models in risk assessment procedures, modifications to the test endpoints currently reported in ERA dossiers are recommended. It is often the case that the data necessary to incorporate toxic effects in a population model are recorded during a low-tier test but not reported in the final dossier if this is not required, whilst the information that is provided (e.g. NOEC, LOEC or ECx) is not ideal for model parameterization. The NOEC, LOEC and ECx as expressions of the toxicity of a chemical compound on an endpoint of interest have already been heavily criticized, for different reasons. Laskowski (1995) and Jager (2011), for instance, provide "Some good reasons to ban the use of NOEC, LOEC, ECx and related concepts in ecotoxicology". Despite these criticisms, which rest on both biological and statistical grounds, NOEC and ECx still feature prominently, not only in regulatory contexts but also in scientific publications.
Therefore, if ecological models are to be used in regulatory risk assessment, it is desirable that these measures be abandoned in favour of other solutions, such as regression analysis (Bruce and Versteeg, 1992; Stephan and Rogers, 1985; Hope, 2005), which allow the concentration-effect relationship to be derived and implemented in ecological models.
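As an illustration of such a regression approach, the sketch below fits a three-parameter log-logistic model to hypothetical reproduction data; the concentrations and juvenile counts are invented for illustration, not measured F. candida values. The fitted curve, rather than a single NOEC or ECx, is what a population model can use directly:

```python
import numpy as np
from scipy.optimize import curve_fit

def log_logistic(conc, top, ec50, slope):
    """Three-parameter log-logistic concentration-response curve."""
    return top / (1.0 + (conc / ec50) ** slope)

# Hypothetical reproduction data: juveniles per test vessel at nominal
# copper concentrations (mg Cu/kg dry soil); values are illustrative only.
conc = np.array([0.0, 100.0, 200.0, 400.0, 800.0, 1600.0])
juveniles = np.array([620.0, 605.0, 540.0, 410.0, 190.0, 40.0])

# replace the control concentration with a tiny value so the power term
# stays defined if the optimizer explores negative slopes
conc_fit = np.where(conc == 0, 1e-6, conc)
params, _ = curve_fit(log_logistic, conc_fit, juveniles, p0=[600.0, 500.0, 2.0])
top, ec50, slope = params
# the full curve log_logistic(c, *params) can now be implemented in a
# population model, instead of reducing the data to a NOEC or a single ECx
```

Any standard dose-response package (e.g. the R drc package) fits the same family of curves; the point is that the complete concentration-effect relationship, not a single summary statistic, is passed on to the effect model.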

Finally, the insight provided by the mechanistic models described in Chapters 3-5 into the population dynamics of Folsomia candida and the long-term effects of copper sulphate suggests that F. candida is not a particularly vulnerable species. For instance, EC50 values for the effect of copper sulphate on F. candida reproduction are generally within the range 500-750 mg Cu kg-1, but even when only a small percentage of clean habitat is available, populations can survive despite lethally high concentrations (e.g. Fig. 5.5). This is especially relevant from the perspective of ERA, where the species tested are chosen to represent entire groups of organisms. F. candida is widely used in risk assessment because it is generally considered to be sensitive to the effects of a wide range of chemicals and is easy to rear in the laboratory (Fountain and Hopkin, 2005). However, populations of this species are not particularly vulnerable even when reduced to very low numbers, as they have an exceptionally high population growth rate (Gregoire-Wibo and Snider, 1977) and reproduce parthenogenetically. Therefore, to ensure that risk assessment covers more vulnerable soil invertebrate species, it may be useful to look for a substitute species. Krogh (2008) recommends, especially for chemicals suspected to interfere with the reproductive biology of sexually reproducing species, that another species such as Folsomia fimetaria be used in combination with, or instead of, F. candida.

6.2 Model type comparison: exercise on model aggregation

Comparison of different model types and their ability to correctly predict ecological processes has been attempted for several ecological applications, especially during the past decade, as computer-based simulation models became increasingly popular. Much of the debate around the choice of model type for specific purposes has focused on individual-based versus matrix models (see, e.g., Stephens et al., 2002; Topping et al., 2005; Hilker et al., 2006; Sable and Rose, 2008), and a series of arguments are traditionally raised for or against the two types of models. Among the most popular arguments in support of IBMs is the fact that the individual-based approach can simulate thousands of individuals, keeping track of traits such as size, age, sex, and location. The equations and rules that define the behaviour of individuals in the model depend on the state of the individual itself, other nearby individuals, and environmental conditions (Grimm and Railsback, 2005). Individuals can differ from one another in their state variables, interact locally with each other, and, in spatially explicit applications, move within the model arena (Tyler and Rose, 1994). Density-dependent growth, mortality, and reproduction emerge from the collective outcome of individual processes, rather than having to be explicitly defined a priori by the model developer (Sable and Rose, 2008). Disadvantages of IBMs are that they often require large amounts of data, need customized computer coding, and produce large amounts of multivariate output that is often hard to validate and interpret (Grimm, 1999). They are also considered less "transparent" than more traditional modelling approaches. In contrast, matrix models track the numbers of individuals in a series of age or stage classes that comprise the life cycle of the population of interest, treating individuals in the same class as identical average individuals (Caswell, 2001).
Other traditional arguments in support of matrix models are that they are relatively easy to construct, make use of readily available demographic data on survival, growth, and reproductive rates, and have been widely used in ecology because they are mathematically tractable and can easily be solved numerically (Dixon et al., 1997). Equilibrium (eigenvalue) analysis of matrix models generates many useful metrics of population dynamics, such as the population growth rate, the stable age or stage distribution, and the elasticities of life-cycle traits (Forbes et al., 2001). The disadvantages of matrix projection models are that they do not easily record the different conditions experienced by individuals during their life history, that their focus on population dynamics makes it difficult to incorporate community and food-web effects, and that density-dependent relationships must be defined explicitly as part of model development (Sable and Rose, 2008).
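The eigenvalue analysis mentioned above can be sketched in a few lines. The stage-structured projection matrix below uses invented vital rates for a generic springtail-like life cycle (egg, juvenile, adult), not parameters estimated for F. candida:

```python
import numpy as np

# Hypothetical 3-stage projection matrix (egg, juvenile, adult);
# each column gives the per-time-step contributions of one stage.
A = np.array([
    [0.0, 0.0, 40.0],  # fecundity: eggs produced per adult
    [0.5, 0.2, 0.0],   # egg hatching; juveniles remaining juvenile
    [0.0, 0.4, 0.6],   # maturation to adult; adult survival
])

eigvals, eigvecs = np.linalg.eig(A)
dom = np.argmax(eigvals.real)
lam = eigvals[dom].real                 # asymptotic population growth rate
w = np.abs(eigvecs[:, dom].real)
stable_stage = w / w.sum()              # stable stage distribution
# lam > 1 indicates a growing population; toxicant-induced reductions in
# the vital rates translate directly into a reduced lam
```

Elasticity analysis follows the same pattern: perturb an entry of A, recompute lam, and compare.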

Drawing on my experience in developing the ecological models presented in this thesis, I found that, for these specific models, some of the above-mentioned arguments do not hold. For instance, the types of data needed to parameterize the IBM were as easy, and in some cases easier, to produce than those needed for a full parameterization of a matrix model including the same processes. In general, IBMs require more data because they often include more complexity, but when two equally complex models are compared, the gap in data needs is not very wide. Furthermore, the kind of demographic data necessary to parameterize a matrix model is not abundantly available in the literature with a sufficient degree of detail when it comes to the effects of chemicals. Even when such data are available, there is often a general bias towards small and short-lived species that can be cultured in a laboratory. A similar view on the parameterization of simple and complex ecological models was reached by Topping et al. (2005). They concluded that, while their IBM was very data intensive with respect to the description of landscape, agronomy and wildlife behaviour, mortality, fecundity, density dependence and stochasticity were all emergent properties. The life-history model they used, on the other hand, required specification of the vital rates appropriate to each new scenario, and therefore made different, but not necessarily easier to fulfil, demands on data.

The issue of choosing the correct model type, however, is especially important when a model is developed for regulatory purposes. In this case, the model has to be understood and run by a number of users that are not modellers themselves: therefore it is essential to understand how much complexity is necessary to answer a regulatory question and to find a trade-off between standardization and flexibility of model structure.

As reported in Chapter 2, stakeholders involved in the ERA of chemicals have contradictory expectations of ecological models in support of the decision-making process. Models are supposed to be simple and user-friendly enough to be easily understood, parameterized, and used in a standardized way. At the same time, however, they should be complex enough to be realistic and capable of capturing a wide range of ecological scenarios. Thus, in order to clarify what can be expected from a specific type of model in terms of its contribution to improving estimates of risk, the costs and benefits of additional complexity for ERA procedures need to be demonstrated by contrasting simple and complex models more often than is currently done.

In Chapter 5, the comparison of a matrix metapopulation model and the IBM presented in Chapter 3 showed that, if the endpoint of interest is population-level effects of homogeneous soil contamination, the added complexity of the IBM is not necessary, as its predictions are very close to the simpler metapopulation model projections. Nevertheless, simulations with heterogeneous contamination showed a lack of consistency between the RAMAS model and the IBM predictions. In particular, at lower concentrations and percentages of contaminated area, the RAMAS model results are close to outcomes of the IBM with avoidance behaviour, whereas at higher concentrations and percentages of contaminated area, they are closer to IBM results without avoidance. Avoidance behaviour is not a standard, routinely measured endpoint for collembolans, and therefore for most chemicals it is not known whether they are avoided or not. This decreases the confidence in the RAMAS model predictions of population-level toxic effects, because in some cases the model overestimates (when the compound is avoided) and in other cases it underestimates (when the compound is not avoided) the risk. The flexibility of an IBM, which allows exploration of both scenarios, gives a better overview of how populations in the field are likely to be affected by a contaminant under different conditions. Furthermore, the RAMAS model was found to be less sensitive than the IBM in detecting population-level effects of different spatial patterns of exposure.

A number of other studies dealing with the simplification and aggregation of complex models in various ecological applications, such as conservation biology (Akçakaya, 2000) and invasive species management (Nehrbass and Winkler, 2007), are available in the scientific literature. The findings of these studies suggest, as do the results presented in Chapter 5, that when individual variability and behavioural responses are likely to influence the outcome of a model, an individual-based approach is preferable to a matrix one. For instance, Nehrbass and Winkler (2007) developed an IBM based on the same data set as a previously developed matrix model. They found that the two models had opposite outcomes in predicting the spread of an invasive plant species, and identified individual variability as the main cause of this result. Stephens et al. (2002) compared the ability of different matrix and individual-based model implementations of alpine marmot populations to reproduce observed behavioural responses and population abundances and variations. One of these models, a spatially explicit individual-based model that ignores behaviour, proved to be highly unrealistic, as it predicted equilibrium densities significantly different from observed values. All the models were also used to predict potential density-dependent effects on alpine marmot population growth, with very different results. The authors concluded that the simplest matrix model was adequate for making predictions regarding population sizes or densities under equilibrium conditions, but that for predictions requiring an understanding of transient dynamics only the behavioural model would be adequate. As Stephens et al. (2002) point out, how much ecology is included in an ecological model matters for some questions we want a model to answer, but not for others.

Therefore, based on the results presented in Chapter 5, I argue that the choice of model type to be used in the risk assessment of chemicals should be based on the specific regulatory question, rather than on generic issues of model complexity. Indeed, the study presented in Chapter 5 showed that describing the two models following the same template makes them equally transparent and understandable, despite their different complexity and structure.

6.3 Ecological models and risk assessment

In recent years, great interest has been shown in the use of ecological models, and several publications have promoted their use (Forbes et al., 2008; Thorbek et al., 2009; Schmolke et al., 2010a; Schmolke et al., 2010b).

The main advantage and selling point of using ecological models is the meaningful extrapolation of laboratory toxicity data, which are usually generated under constant and optimal conditions, to different exposure regimes and to longer temporal scales than can be tested empirically. In Chapters 3 and 4, for example, I showed how it is possible to extrapolate the effects of different spatial distributions of the toxicant (Fig. 3.6) and of different levels of heterogeneity in its concentration (Fig. 3.8) to the population level under constant conditions (food and temperature). Predictions of population-level effects under a more complex exposure scenario are shown in Fig. 4.6, where recovery after different series of disturbance events is investigated over two consecutive years of simulations.

Furthermore, the inclusion of more ecological processes into standard ERA procedures has often been recommended (Straalen, 2003; Van den Brink, 2008), and ecological models are among the tools that are mostly mentioned as capable of achieving this goal. The ecosystems services framework has especially gained momentum in recent years as a basis for environmental management and offers promise as a valuable tool for setting meaningful ecological protection goals (Millennium Ecosystem Assessment, 2005; Nienstedt et al., 2012). Ecological models are the only practical tool currently available to link measurement endpoints, obtained from standard laboratory tests, to relevant protection goals defined within this framework (Galic et al., 2012; Forbes and Calow, 2012).

Ecological models are also very suitable for comparing outcomes of alternative scenarios (Galic et al., 2012). A way to exploit this resource within the decision-making process for pesticide authorization is to use ecological effect models as a management tool, to explore the effects of different prospective risk mitigation options. For instance, ecological models with a highly flexible structure, such as IBMs, allow testing of the effects of different scenarios in terms of number and modes of application, or different buffer zone (i.e. unsprayed) areas.

An example of such a test is presented in Chapter 4, where the IBM was used as a hypothesis-testing tool to simulate long-term effects on F. candida populations of different combinations of theoretical disturbance events and patterns of spatial aggregation of the model contaminant. In this exercise, each disturbance event is assumed to represent the application of a pesticide that has a strong acute effect but a short degradation time. The model results suggest that, as a measure of risk mitigation, controlling the spatial aggregation of the resulting soil contamination is more effective than reducing the application area without controlling for spatial aggregation (Chapter 4, Fig. 4.6b and c). The disturbance patterns tested in this modelling exercise are very generic, but the scenarios could easily be refined to be more realistic if necessary for regulatory application.

Despite the increasing recognition of their potential, mechanistic effect models are not yet widely used in regulatory risk assessment because most of the stakeholders involved do not know how and when to trust such models. As reported in Chapter 2, this lack of trust is largely due to the lack of transparency in the way models are presented and, most importantly, the lack of guidance on what type of models to use for different kinds of questions. A major step in establishing trust in models is thus to provide tools for the standardized testing and documentation of ecological models, following good modelling practice. Examples are the ODD (Overview, Design concepts, Details) protocol (Grimm et al., 2006; 2010) and the framework for transparent and comprehensive ecological modelling (TRACE) documentation (Schmolke et al., 2010a).

The use of ecological models to support environmental decision-making has not always been successful, as some studies demonstrate (Hall, 1988; Comiskey et al., 2004; Gross, 2005; Pilkey and Pilkey-Jarvis, 2007). Failures in previous attempts at using ecological models for environmental decision making are probably the result of too much reliance on the predictive abilities of the models and of flawed assumptions or incorrect parameters, which the lack of transparency in model descriptions made impossible to detect. These failures have likely contributed to the current lack of confidence shown by stakeholders involved in chemical risk assessment.

The evidence provided in Chapter 5 shows a possible way to increase trust in mechanistic effect models with regard to model choice and transparency. As every model type has its own pros and cons (Schmolke et al., 2010a), and it is impossible to combine all pros in one single model, developing and using two types of models for the same question can increase confidence in predictions of risk generated by the models and understanding of the factors that influence the system. Referring to the case-study presented in Chapter 5, the simpler model helps to verify the more complex one and can be used for more homogeneous exposure patterns. The more complex model helps to detect and understand the limitations of the simpler model and is needed to ensure ecological realism for more complex exposure scenarios.

6.4 Final conclusions and outlook

Ecological risk assessment has been going through significant changes during the last decade (Van den Brink, 2008). As human pressure on the environment increases and new, more sensitive measurement tools are developed to detect chemicals in the environment, the risk assessment of these chemicals is adopting novel methods, including ecological effect models, for estimating risks.

The present thesis, as a case study of the application of ecological effect modelling to ERA, dealt mostly with the issue of spatial heterogeneity in soil exposure for collembolan populations. The results presented in Chapters 3, 4 and 5 showed that disregarding spatial heterogeneity, as is the case in current ERA procedures for terrestrial ecosystems, may lead to an overestimation of risk if homogeneous contamination is assumed when this is not the case. More generally, these results suggest that a more realistic exposure assessment can significantly influence estimates of risk for soil organisms. As a way of improving the link between exposure and effects, and thus obtaining more precise estimates of risk, avoidance behaviour should be included as a standard endpoint, considering the great influence its inclusion or exclusion had on model results. The implementation of avoidance behaviour tests as screening tools in ERA has already been supported by several studies (e.g. da Luz et al., 2004; Loureiro and Nogueira, 2005), and in combination with classical ecotoxicological tests they can increase the ecological relevance of effect characterization (Aldaya et al., 2006).

Another observation that can be made from the results produced by the models I developed, and which can be of particular relevance for the risk assessment for soil invertebrates, is that Folsomia candida does not seem to be a particularly vulnerable species. This is mainly due to its fast population growth rate. Therefore it might be worth using another species in combination with F. candida, or developing ecological models not specifically parameterized for F. candida, but instead simulating a generic collembolan species with lower reproductive and growth rates than F. candida, to make sure that the more vulnerable species are actually covered by the assessment of risk.

In conclusion, the time seems ripe for the stakeholders to come together and develop a strategy for including ecological models in ERA, as well as a guidance document on how to use them, as this seems the only way to proceed (Chapter 2). Ecological models have proven to be a useful tool for extrapolating the effects of chemicals from the individual to the population level, and for adding ecological relevance to ERA; it therefore seems only reasonable to now put them to good use.