Does evidence quality complacency hamper conservation?

The problems of evidence complacency are beginning to be recognised. Just as important are the problems of evidence quality complacency: claims of wildlife non-impact in planning, and of subsequent conservation policy success, rely on data that are often incorrectly gathered and reported. Transparent methods and data are needed to test what could otherwise be ‘alternative facts’ or ‘fake news’.

Evidence awareness and trust

With the growth of peer-reviewed evidence to support conservation-based decision making, Sutherland and Wordley have suggested that its uptake is being hampered by a form of complacency.1 Many conservation management decisions still seem to be based on options and methods with little evidential support, or on highly promoted ‘answers’ that are unsupported by evidence.1

There is a call1 for the conservation community to make consideration of evidence part of the professional norm in conservation actions. The study by Sutherland and Wordley stopped short of examining the material collected and used as the basis for such evidence. It is presumed that the data behind these studies are robust and fit for purpose. If they are not as reliable as claimed, should this also be considered part of the wider complacency problem: acceptance of claimed data quality without challenge? This appears to be a particular problem in material collected for planned developments in the UK.2 With the UK Government, amongst others, calling for net biodiversity gain in planning submissions3, which necessarily requires a robust and reliable reference baseline for comparisons, it is critical that the material used to support claims does not suffer from evidence quality complacency: assuming the data are fit for purpose without checking them.

What sorts of data are being collected for planning in the UK?

In the UK, the majority of planning applications need an evaluation of their potential impacts on the immediate and adjacent wildlife, and of the potential cumulative impacts associated with the application and others in the vicinity.4 Most applications involve initial desk and site survey data collection, normally undertaken by developers using in-house ecologists or, occasionally, out-sourced specialist groups.

As the data are collected by professionals, it is expected5 that they will be of high quality, and collected and reported in line with the methods and standards that they state they have used. The data and methods are typically reported in chapters of the Environmental Statements (ES) that accompany planning applications, which should both include the data and state any limitations to those data4,5, so that any third party can evaluate them, assess their suitability for use, and judge the arguments that they might or might not support.

Why should there be evidence complacency?

Ecological documents presented as part of an ES typically run through several drafts before submission. Each has a lead author, and the text is verified and signed off by a senior ecologist within that practice. This process is expected to pick up faults in explanation, logic, or detail that might be queried, or to flag issues that might lead to later problems in a planning submission. The result is presumed to be high quality data suitable for inclusion in an ES and for use in planning determination.4,6 Believing this without detailed evaluation of the material introduces the risk of evidence complacency.

There are several assumptions attached to this presumption. The first is that the stated methods were applied correctly, and reported correctly. The second is that the surveyors were able to carry out the work to the appropriate level: that they had suitable skills and experience. The third is that any limitations to the work, whether desk or field survey, are reported and their implications clearly set out. The fourth is that the sign-off process actually involves a critical assessment of the documents, and the claims being made there, so that sign-off has credible value. The fifth is that uncritical acceptance of the material produced is justified.

The generation of reliable data sets

For most taxa in the UK there is a set of standard methods to be used for surveys.7 Group- or species-specific guidelines8,9,10,11,12 set out the details of the methods to be used, including timing within the year and day, frequency of visits, effects of weather conditions, minimum number of visits, and the limitations that affect surveys if these are not followed. This places a clear onus on surveyors to report what they did, and to note any resulting problems. It is also expected that data will be made available to interested parties in the ES and its appendices, or on request.4,5 Variability in weather, access or logistics often means that the context of the actual surveys is a little more complex than the ideal, which has implications for interpreting the data and their reliability. A simple compliance check of this kind is sketched below.
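To make the point concrete, the sketch below shows what a basic check of survey records against such guideline criteria could look like. It is purely illustrative: the field names, season window, wind threshold and minimum visit count are invented for the example, and real values would come from the relevant species-specific guidance.8,9

```python
from dataclasses import dataclass
from datetime import date

# Hypothetical thresholds, for illustration only; real values come from
# the relevant species-specific guidance (e.g. bat or water vole handbooks).
MIN_VISITS = 3             # assumed minimum number of valid visits
SEASON_MONTHS = (5, 8)     # assumed survey season: May to August inclusive
MAX_WIND_BEAUFORT = 4      # assumed ceiling for reliable survey conditions

@dataclass
class Visit:
    when: date
    wind_beaufort: int

def survey_limitations(visits):
    """Return limitation statements of the kind an ES should report."""
    notes = []
    in_season = [v for v in visits
                 if SEASON_MONTHS[0] <= v.when.month <= SEASON_MONTHS[1]]
    usable = [v for v in in_season if v.wind_beaufort <= MAX_WIND_BEAUFORT]
    if len(in_season) < len(visits):
        notes.append(f"{len(visits) - len(in_season)} visit(s) fell outside the survey season")
    if len(usable) < len(in_season):
        notes.append(f"{len(in_season) - len(usable)} visit(s) took place in unsuitable weather")
    if len(usable) < MIN_VISITS:
        notes.append(f"only {len(usable)} valid visit(s) against a guideline minimum of {MIN_VISITS}")
    return notes

visits = [Visit(date(2017, 4, 20), 2),   # out of season
          Visit(date(2017, 6, 3), 5),    # too windy
          Visit(date(2017, 7, 12), 2)]   # valid
for note in survey_limitations(visits):
    print("Limitation:", note)
```

Even a check this crude makes the limitations explicit and reportable, rather than leaving them buried in field notes or omitted altogether.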

Several studies13,14,15 have looked at the reliability of Phase 1 habitat surveys.16 One of these13 asked professional ecologists to report on the reliability of Phase 1 surveys carried out by themselves or their peers. The results suggested a high degree of variability in survey skills and data quality, and associated problems in relying on the Phase 1 data produced as a suitable reference baseline. Others15,17 also reported problems in the spatial repeatability of habitat mapping boundaries and categories in habitat and vegetation surveys. Respondents13 noted that many of the faulty data sets discovered needed to be replaced, presumably through resurvey by more reliable surveyors, before the data could be used. Most of the errors or problems reported were attributed to inexperienced surveyors: exactly the group typically hired for seasonal work, and requiring supervision and data quality assessments in ES reports.6

The use of best-practice guidelines means that there is implicit room for minor variability according to site circumstances. This is usually covered under the blanket term ‘professional judgement’2,4,5,9, but requires substantiation and validation.4 This judgement, frequently rephrased as expert opinion, may include terms such as ‘based on’ or ‘in line with’ the claimed guidance, without offering any explanation of quite how, or why, the work varied from the standard cited. Any methodological or practical problems should be reported in an ES under the heading of limitations.4,18,19 Limitations are key to understanding the robustness of the data, yet few ESs report them, or they dismiss them as unimportant, mostly without proof or validation2,19, so that claims are hard to support. This means that there is often an unquantified level of uncertainty attached to the data in an ES and to their interpretation.

Whether the issues associated with surveys or their reporting are the product of complacency, or of contractual or other pressures,19,20 they should be picked up in the internal document peer-review process and removed by senior, more experienced staff. It is clear from looking at published ES documents that many problems pass or bypass the scrutiny process2, so that they appear in the public domain as validated. That many of the senior staff reviewing and passing drafts are also registered with professional bodies as professional ecologists adds to the apparent sense of reliability, and to the consequent complacency of planners. As one study put it, having an EIA “gave planners added confidence that their considerations of the proposals were well informed”.21

What are the consequences of quality complacency?

The authors of a review of evidence complacency1 were concerned about the poor uptake of evidence-based conservation practices despite their availability, and thought this would lead to poor management outcomes. The passivity was described as complacency on the part of practitioners. But what if limited uptake reflected practitioners’ awareness that the evidence behind some of these practices was questionable? Their non-use would then be a reasonable, and essentially active, response. Without checking, passive and active responses look the same. Being able to make an informed decision requires knowing a lot about the data and their context.

When data are put into the public domain through an ES, there is an assumption that they provide a secure baseline for planning purposes and for delivering government aspirations, such as wildlife net gain through mitigation and positive management, if the proposal is approved. If there are poor controls on the use of methods, partial or missing explanations of limitations, and problems associated with the survey data, there is little basis for using such data as a secure baseline against which to judge the extent of possible change.13

Being unable to rely unquestioningly on professionally gathered data, especially in a period in the UK when planning authorities employ fewer and fewer ecologists4, places planners in a conundrum. Examining an ES for possible errors takes time and experience, both of which are rarely available to planning authorities. It also places an onus on planners to call out faults, when the professional surveyors and the internal scrutiny processes should do this as part of the design, implementation, review, and assessment of the ES for any potential development.

If the basic methods and data in the ES are a problem, the tools that spin off from such data, such as offsets, ‘no net loss’ or ‘net gain’, are equally at risk.22,23 Being able to demonstrate these is far from trivial.18,24 Yet, for some, it is the ‘spirit’ of the approach that matters25, even though this is unquantifiable and sidesteps the issue of transparent delivery.

Being clear about what is being offset, and what ‘no net loss’ or ‘net gain’ might look like, depends on reliable baselines.26,27 If these are less robust than generally presumed, then there is a basic issue of unwarranted complacency.
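A small worked example makes the dependency plain. The figures below are invented, and the ±20% error band is simply an illustrative stand-in for the sort of inter-observer variation reported in habitat surveys13,15: the point is that a claimed gain smaller than the baseline uncertainty cannot be demonstrated.

```python
# Invented habitat-unit figures, purely for illustration.
baseline = 100.0        # habitat units reported in the ES baseline
claimed_post = 110.0    # habitat units claimed after mitigation
error = 0.20            # assumed +/-20% baseline measurement uncertainty

low, high = baseline * (1 - error), baseline * (1 + error)
print(f"The true baseline could plausibly lie between {low:.0f} and {high:.0f} units.")

if claimed_post <= high:
    print(f"A claimed gain of {claimed_post - baseline:.0f} units sits inside "
          "the baseline uncertainty: 'net gain' cannot be demonstrated.")
```

On these invented numbers, a site that actually lost habitat could still be reported as delivering a 10% gain, simply because the baseline was measured low.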

Solutions to quality complacency

If, as I suggest, quality complacency is an important issue, then challenging it will potentially have major effects at site level. It will no longer be safe to accept unwarranted claims of no potential impact, and there will be uncertainty about evaluating nationwide policy tools such as ‘net gain’.3 Changing this is not simple.

The first step must be the open and clear statement of field methods and linked limitations4,5 from any site, without defensiveness on the part of the potential developer. For the UK, most of the professionals involved in site surveys and assessments will be members of CIEEM (the Chartered Institute of Ecology and Environmental Management), so closer adherence to CIEEM guidance and its code of conduct28 should help deliver this. A similar approach would work in other jurisdictions and professional associations.



The second step is increased transparency: easy access to data is fundamental4,19,28, yet it is often difficult, and poorly met in practice, with data availability not uncommonly delayed until close to, or even after, planning determination.29 The promotion of biodiversity net gain guidance by professional institutes such as CIEEM and IEMA (the Institute of Environmental Management and Assessment) and construction fora such as CIRIA30 should be accompanied by clear statements on data access and provision as a matter of course. That a number of studies suggest that pressures exist to slow down provision gives a hint of the challenge in practice.16,20

A third step, if data access and quality are problems, may be the possibility of professional sanction, drawing on a wide-scale review of the documentation used in planning cases. With staff shortfalls in planning authorities18, there may be a potentially toxic combination of inadequate resources (too few staff, too many pages of planning documents) and evidence quality complacency (assumed adequacy in reports produced by registered professionals) in play. Under the current UK planning system, it is up to the planners, or those concerned about the suitability of a planning application, to call out data issues and then to formally register concerns. This effectively adversarial system2, calling for specialist input from reviewers to identify issues and problems, is not consistent with the resource bases available in many cases.

CIEEM currently hears, and decides on, complaints of poor professional practice.19 These are necessarily a small, self-selecting sample of potential cases. If groups such as CIEEM or IEMA were regularly to select a random sample of planning applications requiring ES submissions within the UK on a structured basis (covering geographies and development types on a rolling basis), and critically examine these, then the likelihood of complacency in data submissions would be reduced; a sketch of such a sampling scheme follows below. Without this, the risk of negative reputational exposure remains minimal. By inserting effective scrutiny, coupled with potential professional sanction, the risk of reputational damage might be expected to tighten quality standards: reducing poor quality data, and increasing the possibility of realistically assessing the viability of the practical and policy tools which rely on such data.
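As a sketch of how such a structured rolling sample might be drawn, the example below stratifies applications by region and development type before sampling. The strata, field names and sample sizes are hypothetical, and it assumes a register of ES submissions with region and type recorded; a real scheme would use actual planning geographies and rotate coverage between audit rounds.

```python
import random
from itertools import product

# Hypothetical strata for illustration; a real audit scheme would use
# actual planning regions and development types, rotated between rounds.
REGIONS = ["Scotland", "Wales", "Northern England", "Midlands", "Southern England"]
TYPES = ["housing", "infrastructure", "energy", "minerals"]

def draw_audit_sample(applications, per_stratum=2, seed=None):
    """Draw up to `per_stratum` ES submissions from every region-type cell.

    `applications` is a list of dicts with 'id', 'region' and 'type' keys.
    """
    rng = random.Random(seed)
    sample = []
    for region, dev_type in product(REGIONS, TYPES):
        cell = [a for a in applications
                if a["region"] == region and a["type"] == dev_type]
        sample.extend(rng.sample(cell, min(per_stratum, len(cell))))
    return sample

# A toy register of 500 submissions with randomly assigned strata.
register = [{"id": i,
             "region": random.choice(REGIONS),
             "type": random.choice(TYPES)} for i in range(500)]
audit = draw_audit_sample(register, per_stratum=2, seed=1)
print(f"{len(audit)} applications drawn for detailed scrutiny this round")
```

Because every region-type cell is sampled, no practice could assume its submissions would escape scrutiny, which is precisely the deterrent effect proposed above.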

Clearly, if there were faith in the baseline data sets behind planning, and in tools such as offsets or ‘net gain’, then there might also be increased uptake of conservation evidence practices. Until the data behind policy tools can be relied on, the wider family of evidence-based approaches is also at risk. Complacency about data quality, as well as about conservation evidence, will not improve decision making or outcome evaluation for practical benefit. The biggest risk is that claims will continue to go unchecked and unevaluated: ‘alternative facts’1 or ‘fake news’ may have as much currency as real outcomes, and cannot be detected or rebuffed. That is not a suitable basis for complacency.

References

  1. Sutherland, W.J. & Wordley, C.F.R. (2017) Evidence complacency hampers conservation. Nature Ecology & Evolution. DOI: 10.1038/s41559-017-0244-1
  2. Reed, T. M. (2017) Data reliability, data provision, professional judgment and assessing impact assessment for planning purposes. In Practice 95: 43-48.
  3. Defra (2017) EIA regulations (www.legislation.gov.uk/uksi/2017/571/contents/made)
  4. CIEEM (2016) Guidelines for ecological impact assessment in the UK and Ireland. Winchester, CIEEM.
  5. BSI (2013) BS 42020:2013 Biodiversity. Code of practice for planning and development. BSI, London.
  6. CIEEM (2015) Guidelines for ecological report writing. Winchester, CIEEM.
  7. Hill, D, Fasham, M., Tucker, G., Shewry, M. & Shaw, P. (2005) Handbook of biodiversity methods. CUP, Cambridge.
  8. Hundt, L. (2012) Bat surveys: Good Practice Guidelines, 2nd edition. London, BCT.
  9. Collins, J. (2016) Bat surveys for professional ecologists: good practice guidelines. London, BCT.
  10. Gilbert, G., Gibbons, D.W. & Evans, J. (1998) Bird monitoring methods. Sandy, RSPB.
  11. Strachan, R., Moorhouse, T. & Gelling, M. (2011) Water Vole Conservation Handbook (3rd edition). Wildlife Conservation Research Unit, University of Oxford
  12. Dean, M., Strachan, R., Gow, D & Andrews, R. (2016) The Water Vole Mitigation Handbook. Southampton, Mammal Society.
  13. Cherrill, A. (2016) Inter-observer variation in habitat survey data: investigating the consequences of professional practice. Journal of Environmental Planning and Management. DOI: 10.1080/09640568.2015.1090961
  14. Cherrill, A. (2013) Repeatability of vegetation mapping using Phase 1 and NVC approaches. In Practice 81:41-45.
  15. Hearn, S. M., Healey, J. R., McDonald, M. A., Turner, A. J., Wong, J. L. G., & Stewart, G. B. (2011) The repeatability of vegetation classification and mapping. Journal of Environmental Management, 92: 1174-1184.
  16. JNCC (2010) Handbook for Phase 1 habitat survey. Peterborough, JNCC.
  17. Cherrill, A. & McLean, C. (2001) Omission and commission errors in the field mapping of boundary features. Journal of Environmental Planning and Management 44: 331-343.
  18. CIEEM (2016a) Pragmatism, proportionality, and professional judgement. In Practice 91: 57-60.
  19. Thompson, D., Graves, R., Hayns, S. & Alexander, D. (2016) In Practice 91: 63-66.
  20. Ray, J.C. (2016) Submission to the Expert Panel for the Review of Environmental Assessment Processes. http://eareview-examenee.ca/wp-content/uploads/uploaded_files/ea-expert-panel-submission_ray_wcscanada_23dec2016.pdf
  21. Jay, S., Jones, C., Slinn, P. & Wood, C. (2007) Environmental impact assessment: retrospect and prospect. Environmental Impact Assessment Review. DOI: 10.1016/j.eiar.2006.12.001
  22. IEEP (2014) Evaluation of the biodiversity offsetting programme: Final Report. Collingwood Environmental Planning & IEEP, London.
  23. Bull, J.W., Gordon, A., Law, E.A., Suttle, K.B. & Milner-Gulland, E.J. (2014) Importance of baseline specification in evaluating conservation interventions and achieving no net loss of biodiversity. Conservation Biology 28: 799–809.
  24. Bull, J.W. & Brownlie, S. (2015) The transition from no net loss to a net gain of biodiversity is far from trivial. Oryx 51: 53–59.
  25. Bull, J.W., Suttle, K.B., Gordon, A., Singh, N.J. & Milner-Gulland, E.J. (2013) Biodiversity offsets in theory and practice. Oryx 47: 369–380.
  26. TBC (2016) Protected areas and IFC Performance Standard 6. TBC, Cambridge.
  27. Spurgeon, J. (2017) Net impact: ten things all businesses should know. http://naturalcapitalcoalition.org/net-impact-ten-things-all-businesses-should-know/
  28. CIEEM. (2013). Code of conduct. Winchester, CIEEM.
  29. Cambridgeshire County Council (2017) Chesterton Bridge Water Vole Report. C/5005/16/CC. Cambridge, CCC.
  30. CIEEM & IEMA. (2017) Biodiversity net gain, good practice principles for development. https://www.cieem.net/biodiversity-net-gain-principles-and-guidance-for-uk-construction-and-developments


TIMOTHY REED

Director of Tim Reed Ecological Consultants Ltd


Cite:

Reed, Timothy “Does evidence quality complacency hamper conservation?” ECOS vol. 39(1) 2018, British Association of Nature Conservationists, www.ecos.org.uk/does-evidence-quality-complacency-hamper-conservation/.
