After 50 years, it is time to bring environmental policy and regulatory decision making into the 21st century by applying statistical paradigms that produce technically sound and legally defensible results from environmental data. When the Clean Water Act, the Endangered Species Act, and the National Environmental Policy Act were enacted, and federal agencies were directed to develop regulations to ensure compliance with them, biologists and ecologists knew less about environmental systems and data analysis than we do today. Federal scientists had insufficient data for the wide variety of ecosystems covered by these statutes, and the only statistical paradigm they knew was null hypothesis significance testing (NHST), which is still the only one taught in basic statistics courses. Unfortunately, this frequentist paradigm is rarely appropriate for environmental data. Statisticians have debated the mathematical, logical, and philosophical problems with the frequentist paradigm for about 80 years. Regulators, regulated companies, and the consultants and attorneys who assist them now have better paradigms available for analyzing environmental data. Understanding why the frequentist paradigm fails environmental data is the first step toward recognizing the benefits of applying alternative approaches to your own data.