Applied Ecosystem Services, LLC

The Environmental Issues Doctor

Why Water Quality Improvement Projects Fail

Water quality matters for humans, livestock, fish and wildlife, and plants, including food crops. Too often, policies and regulations are ineffective and restoration projects fail to achieve their intended goals. The problem appears in environmental impact assessments, point- and nonpoint-source discharges, and Superfund sites. While some reasons for failure are project-specific, three common and easily avoided ones are ignorance of the spatial and temporal distribution of the chemical of concern, lack of information about the causes and magnitude of its variability, and a focus on concentrations at a single location rather than on the entire ecosystem.

Natural ecosystems are highly dynamic; they are stochastic (random), not deterministic. Geochemical data therefore need to be analyzed with models that quantify the uncertainty (randomness) in the data. Statistical models are fit to the available data and measure variability, uncertainty, predictability, and relationships.

Deterministic models describe behavior with fixed equations and require that the data fit those invariant forms. That approach works in the built environment, but not in ecosystems as variable as aquatic ones.
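
As a minimal illustration of the difference, the sketch below (in Python, with invented nitrate data) applies the same fixed seasonal equation a deterministic model would impose, then quantifies the residual scatter that the statistical view insists on reporting. The analyte, constants, and noise level are all hypothetical.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)

# Hypothetical daily nitrate concentrations (mg/L): a seasonal signal plus
# the random variability a natural stream actually exhibits.
days = np.arange(365)
seasonal = 2.0 + 1.5 * np.sin(2 * np.pi * days / 365)
observed = seasonal + rng.normal(0, 0.6, size=days.size)

# Deterministic view: the fixed equation alone, with no uncertainty attached.
deterministic_prediction = seasonal

# Statistical view: quantify the scatter of the data around the fitted curve.
residuals = observed - deterministic_prediction
sd = residuals.std(ddof=1)
lo, hi = stats.norm.interval(0.95, loc=0.0, scale=sd)

print(f"residual standard deviation: {sd:.2f} mg/L")
print(f"95% band around the seasonal mean: {lo:+.2f} to {hi:+.2f} mg/L")
```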

Every environmental observation or measurement has a specific location and time associated with it. Too often these are not recorded, and the loss of that information greatly reduces the value of the data. Seeing how geochemical concentrations are distributed over space and time provides valuable insight into their dynamics in the water body of interest and guides policies and regulations.
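
A sketch of the record-keeping this implies: every measurement carries its site, coordinates, and timestamp, which makes space-time summaries trivial. The column names, sites, and values below are invented for illustration.

```python
import pandas as pd

# Each row is one measurement with its full spatial and temporal context.
samples = pd.DataFrame(
    {
        "site_id": ["RM-12", "RM-12", "RM-47"],
        "latitude": [45.512, 45.512, 45.388],
        "longitude": [-122.658, -122.658, -122.601],
        "sampled_at": pd.to_datetime(
            ["2023-03-14 09:30", "2023-06-02 08:15", "2023-03-14 11:05"]
        ),
        "analyte": ["nitrate", "nitrate", "nitrate"],
        "mg_per_L": [2.4, 1.1, 3.7],
    }
)

# With location and time attached, summaries over space and time are one
# line; without them, the same numbers answer almost no useful question.
print(samples.groupby(["site_id", samples["sampled_at"].dt.quarter])["mg_per_L"].mean())
```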

Quantifying variability in geochemical concentrations allows inherent changes due to season, ambient weather, and other natural causes to be separated from changes caused by anthropogenic activities such as agriculture, mining, transportation, or manufacturing. Regulatory compliance should address only changes caused by human actions, not those within the expected range of variability of the dynamic ecosystem itself.
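
One simple way to operationalize that separation, sketched below with invented data, is to build a seasonal baseline from reference years and flag only observations falling outside the range the ecosystem itself produces. The percentile limits are an illustrative choice, not a regulatory standard.

```python
import numpy as np
import pandas as pd

rng = np.random.default_rng(7)
months = np.tile(np.arange(1, 13), 10)  # ten years of monthly reference data
baseline = pd.DataFrame(
    {
        "month": months,
        "mg_per_L": 2.0
        + 1.5 * np.sin(2 * np.pi * months / 12)
        + rng.normal(0, 0.4, months.size),
    }
)

# Expected natural range by month: 2.5th-97.5th percentiles of reference years.
limits = baseline.groupby("month")["mg_per_L"].quantile([0.025, 0.975]).unstack()

# A new observation is a compliance concern only if it falls outside the
# range the ecosystem itself produces in that season.
new_obs = {"month": 6, "mg_per_L": 4.9}
lo, hi = limits.loc[new_obs["month"]]
flagged = not (lo <= new_obs["mg_per_L"] <= hi)
print(f"June expected range: {lo:.2f}-{hi:.2f} mg/L; flagged: {flagged}")
```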

Policies and permit compliance focused on point-source discharges are comparatively simple, even without costly and constantly changing mixing-zone analyses. Basin-wide policies such as total maximum daily loads (TMDLs) and compliance for nonpoint-source contaminants are much more complicated and commonly leave all sides dissatisfied with the results.

Unlike planned environmental geochemical research projects, almost all available data are collected only as required by regulators. As a result, sampling locations tend to be clumped (usually near urban areas and specific point-source outfalls) rather than spread across the entire river network, and sampling times are highly irregular. Some discharge permits, such as those for storm water, may require monitoring only once, twice, or four times per year. Baseline data for environmental impact assessments might be collected for only a year or two at a limited number of locations. Analytical models that do not account for clumped or random spatial sampling and infrequent, irregular temporal sampling yield wrong results, and policies and actions based on wrong analytical outputs are unlikely to produce the desired future conditions in water quality.
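
The sketch below, with invented concentrations, shows one small instance of the problem: when samples are clumped into two short monitoring windows, a naive mean of the samples differs noticeably from a time-weighted mean that accounts for the span of time each sample represents.

```python
import numpy as np

# Sample dates (day of year) clumped into two short monitoring windows,
# as quarterly or twice-yearly permit sampling typically produces.
days = np.array([10, 12, 15, 200, 202, 205, 210])
conc = np.array([3.8, 4.1, 3.9, 1.2, 1.1, 1.3, 1.0])  # mg/L

naive_mean = conc.mean()

# Weight each sample by the interval it represents (midpoints between
# consecutive sample dates, extended to the ends of the year).
edges = np.concatenate(([0], (days[:-1] + days[1:]) / 2, [365]))
weights = np.diff(edges)
weighted_mean = np.average(conc, weights=weights)

print(f"naive mean:         {naive_mean:.2f} mg/L")
print(f"time-weighted mean: {weighted_mean:.2f} mg/L")
```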

It is not sufficient to consider spatial and temporal variability separately, because they are integrated in natural ecosystems. Applying appropriate spatiotemporal statistical models rather than deterministic differential-equation models greatly increases the value of the knowledge available to policy makers and regulators, and with it the likelihood of successful pollution control and ecosystem recovery.
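
As a deliberately simplified stand-in for a full spatiotemporal model, the sketch below fits site, season, and their interaction by ordinary least squares on invented data. A significant interaction term is precisely the signal that space and time cannot be analyzed separately: the seasonal pattern itself differs between sites. Real applications would use models with an explicit space-time covariance structure.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(3)

# Two sites, two seasons, with a seasonal effect that differs by site.
sites = np.repeat(["upstream", "downstream"], 24)
season = np.tile(np.tile(["wet", "dry"], 12), 2)
effect = {
    ("upstream", "wet"): 1.8, ("upstream", "dry"): 1.2,
    ("downstream", "wet"): 3.5, ("downstream", "dry"): 1.4,
}
conc = np.array([effect[(s, t)] for s, t in zip(sites, season)])
conc = conc + rng.normal(0, 0.3, conc.size)

df = pd.DataFrame({"site": sites, "season": season, "mg_per_L": conc})
fit = smf.ols("mg_per_L ~ C(site) * C(season)", data=df).fit()

# The site:season interaction row shows whether the seasonal pattern
# depends on location, i.e., whether space and time are integrated.
print(fit.summary().tables[1])
```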

This work was originally published on the Applied Ecosystem Services, LLC web site at https://www.appl-ecosys.com/blog/why-water-quality-projects-fail/

It is offered under the terms of the Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 International license. In short, you may copy and redistribute the material in any medium or format as long as you credit Dr. Richard Shepard as the author. You may not use the material for commercial purposes, and you may not distribute modified versions.

Contact me for assistance in achieving positive outcomes.