Just about every one of us was introduced to the scientific method in a middle-school science class. We were told that science happens when hypotheses about how things work are evaluated against observations and data. Facts, we were told, emerge when hypotheses stand up, or fail to stand up, to that test. And that, we learned, is how science is done.
Of course, for most of us that straightforward explanation had no ready adolescent application, and the lesson, like so many others, was lost in the white noise of stuff that mattered at the time. Even for the teenagers who would later become scientists, that early in-one-ear-and-out-the-other exposure to the scientific method was often the last. Incredibly, a college education in many scientific disciplines is no longer grounded in the scientific method — the central concept underpinning the advancement of humanity’s collective knowledge.
Almost 60 years ago, in the esteemed journal Science, John Platt published a pithy critique of self-identified scientists who eschew the scientific method and the “strong inference” it generates through hypothesis testing. He argued that many scientists, and whole cohorts of them in certain areas of investigation, don’t exercise the scientific method in pursuing their research agendas. Their ineffective and inefficient approaches leave those practitioners less able to advance knowledge in their scientific disciplines.
Platt’s admonishment is as valid as ever. His paper is must-reading for everyone in the environmental sciences, as well as for those who look to environmental scientists for guidance. It should be a mandatory assignment for biologists who identify themselves as conservation scientists, including those engaged in studies in the Sacramento-San Joaquin Delta and upper San Francisco Estuary. Despite the immediacy of the challenge of saving the Delta’s imperiled species, the habitats upon which they depend, and the ecosystems that support both, investigations in the Delta that confront explicitly stated hypotheses with data specifically collected for that task are rare. Accordingly, although public agencies have invested hundreds of millions of dollars in research intended to inform efforts to conserve native species, that research has failed to deliver findings that can do so.
Platt lauds scientists who develop alternative hypotheses and confront them with data as their means of advancing reliable knowledge. He points to the then-great progress in molecular biology and high-energy physics, crediting scientists in those disciplines with almost religious adherence to protocols that apply strong inference to every problem “formally and explicitly and regularly.” Given that the paper has been cited more than four thousand times in the six decades since its publication, one would think that Dr. Platt’s finger-wag at those who forego hypothesis testing might have initiated a sea change in research-design approaches in the environmental sciences. Thinking that would be wrong.
Those who follow the Delta science endeavor will find scant evidence of hypothesis testing among the mountain of investigations and analyses underwritten by a flood of federal, state, and local agency funding over the past two decades. A good portion of the Delta resource-management enterprise that is proposed, packaged, and funded as science doesn’t produce strong inferences. Accordingly, some might argue that much of what scientists operating in the Delta are doing isn’t science. Certainly, there is a lot of data collection going on. But almost none of it occurs within the sampling schemes and experimental frameworks necessary to produce definitive results that can be directly converted into management actions with a high likelihood of contributing to the recovery of the Delta’s declining natural resources. The biologists chasing a better understanding of the environmental needs of imperiled fishes in the Sacramento and San Joaquin rivers, their tributaries, and the San Francisco Estuary need to do better than just counting things and measuring stuff — much better.
The Delta Science Program was established to support implementation of the Delta Plan and “to provide scientific information and syntheses for the state of scientific knowledge on issues critical for managing the Bay-Delta system. This knowledge must be unbiased, relevant, authoritative, integrated across agencies, and communicated to stakeholders.” The program is supposed to promote “adaptive management and the use of best available science.” Worthy objectives, surely. But more than two decades after its initiation under the CalFed experiment, the program can offer little proof that its strategy for developing and communicating critical information to resource managers and a concerned public is working. The Delta Science Program’s inability to encourage scientifically robust approaches in the state and federal resource agencies’ research, modeling, and monitoring efforts is well-documented. The ultimate proof, of course, is the continued decline of fishes listed under the federal and state Endangered Species Acts that reside in or migrate through the embattled Delta. But, with the recent appearance of a Science Action Agenda, the program’s second five-year plan, one might have hoped that the ineffective trajectory of Delta science would take a turn.
Unfortunately, although the Science Action Agenda is subtitled A Vision for Integrating Delta Science, the 2022-2026 action agenda offers few insights regarding the actual data collection, analyses, and interpretation that can result in reliable knowledge useful to Delta resource managers and policy makers. The document is rich with platitudes about science engagement that involves “planning, implementation, and reporting” serving as “a foundation for achieving the vision of One Delta, One Science – an open Delta science community that works together to build a common body of scientific knowledge to inform management.” What can that possibly mean when the science community doesn’t follow the time-tested procedures that practicing scientists should be expected to employ? The term hypothesis testing is nowhere to be found in the new document. Nor is a definition of adaptive resource management. Nor is there any acknowledgement that targeted and effective resource monitoring is the foundational scientific activity in any conservation planning effort intending to recover imperiled species and their habitats.
The centerpiece of the new report, and its ostensible step forward from the first Science Action Agenda, is a program initiative that took 1,279 management questions from an “iterative, collaborative process” and culled them down to 65 management questions distributed across six management-needs categories. That set the stage for the selection of the top 25 Science Actions. The exercise was part popularity contest, part effort to mollify stakeholders concerned that multi-agency resource management has not succeeded in reversing the declines of protected fishes. While an extensive list of management questions might seem like progress of a sort, it really isn’t. There is little to suggest that the Delta Science Program recognizes the need to convert management questions into testable, management-relevant hypotheses, then translate those hypotheses into requisite sampling schemes and ecological modeling tools, and then analyze the data and generate predictions that meet current standards of quantitative scientific practice.
Management questions are cheap to come by and frequently not particularly helpful. They inevitably encourage an unbounded and weakly directed research agenda, one that might resolve certain uncertainties, but rarely those that most bedevil resource managers and policy makers who need sharp answers to guide effective, efficient, and accountable resource management and environmental restoration programs. Responses to a management question may narrow guesses about what the Delta’s fishes and their habitats need from a directed conservation effort, but little more. What is needed instead are efforts to evaluate management-relevant hypotheses using data-collection schemes explicitly designed to disprove, if possible, our best ideas of how the Delta’s damaged aquatic ecosystems can be restored and then sustained.
John Platt describes the “scientific” approach to producing reliable knowledge, an approach that Delta scientists too often eschew, mostly with the argument that hypothesis testing is just too hard to do. But it is not.
Achieving strong inference to guide resource management should not perforce remain elusive. Dr. Platt offers straightforward steps that agency science staff and their consultants can readily follow (a brief illustration follows the list):
1) Devising alternative hypotheses
2) Devising a crucial experiment (or several of them) with alternative possible outcomes, each of which will, as nearly as possible, exclude one or more of the hypotheses
3) Carrying out the experiment so as to get a clean result
4) Recycling the procedure, making sub-hypotheses or sequential hypotheses to refine the possibilities that remain, and so on.
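Those are Platt’s steps verbatim. Nor are they confined to the laboratory bench; they map directly onto the kind of analysis Delta biologists already perform. What follows is a minimal sketch, not a prescription: the survey numbers and the two competing flow-abundance hypotheses are invented for illustration and come from no Delta dataset.

```python
# Platt's loop in miniature: two alternative hypotheses about a
# fish's response to outflow, confronted with the same (invented) data.
import numpy as np

# Hypothetical paired observations: a spring outflow index and a
# juvenile-abundance index. These values are placeholders.
flow      = np.array([0.8, 1.1, 1.9, 2.4, 3.0, 3.6, 4.2, 4.9])
abundance = np.array([2.1, 2.0, 2.9, 3.1, 3.8, 3.5, 4.4, 4.6])
n = len(abundance)

def aic(rss, k):
    """Akaike's information criterion for a least-squares fit with
    k estimated parameters (Gaussian errors assumed)."""
    return n * np.log(rss / n) + 2 * k

# H0: abundance does not vary with flow (intercept-only model).
rss0 = np.sum((abundance - abundance.mean()) ** 2)

# H1: abundance increases linearly with flow.
slope, intercept = np.polyfit(flow, abundance, 1)
rss1 = np.sum((abundance - (slope * flow + intercept)) ** 2)

# Steps 2 and 3: the "crucial experiment" reduces here to a model
# comparison; the hypothesis that fits decisively worse is excluded.
print(f"H0 (no flow effect): AIC = {aic(rss0, 1):.1f}")
print(f"H1 (linear effect):  AIC = {aic(rss1, 2):.1f}")

# Step 4: recycle. Split the surviving hypothesis into sub-hypotheses
# (which flow window? which life stage?) and design the next survey
# to exclude one of them.
```

The arithmetic is not the point; the structure is. Each round of data collection is designed so that at least one candidate explanation can be excluded, and whatever survives is refined and put back at risk.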
You may say, well, that sounds like science at the laboratory bench, not science constrained by the opaque waters of the Delta. And you might observe that in the Delta scientists struggle to get the flimsiest control and replication into their data-collection schemes, much less carry forward a sampling design that concurrently or consecutively tests multiple hypotheses. But to suggest, without any serious attempt to do so, that investigations producing strong inference can’t be done in the Delta is unacceptable. Nor should those concerned with the health and ecological integrity of the Delta accept the arguments put forward by four past Lead Scientists operating under the Delta Science Program, who describe the environmental challenges in the Delta as “wicked problems,” too manifold to be overcome with input from “traditional” scientific approaches. The Delta system, they declare, is too “complex, chaotic, or cantankerous” to be understood, implicitly suggesting that the lack of findings useful for conservation planning from hundreds of millions spent on research, modeling, and monitoring should be excused for those alliterative reasons. Say the Lead Scientists: “Multiple interacting factors affect the well-being of native species. Some of these factors are well understood, but their interactions and cumulative consequences are not, making it impossible to make definitive statements about what is causing native species to decline.”
That’s not so. Data collection in service of hypothesis testing has not been de rigueur in the Delta science endeavor. Consequently, three decades after the delta smelt was listed under the federal Endangered Species Act, there is raging disagreement over the environmental factors responsible for the fish’s apparent steady slide toward extinction. For that matter, absent an adaptive management agenda and directed monitoring implemented in an experimental framework, there is no reliable estimate of how many delta smelt remain in the upper San Francisco Estuary, no reliable account of where they are successfully reproducing, and no settled understanding of what habitat conditions should be restored to sustain the embattled species into the future. Delta scientists have not been able to make “definitive statements” about the declines in the Delta’s fishes because they don’t practice the scientific pursuits that lead to definitive findings.
In conservation science there are, of course, all manner of scientific pursuits beyond efforts to evaluate management-relevant hypotheses – modeling ecosystem structure and function, surveys to determine the distribution of species and their habitats, establishing species-habitat relationships, validating indicators, surrogates, and proxies for scarce and hard-to-sample species, and studies of metapopulation dynamics among them. All those efforts and more can add depth and breadth to a science program. But a science program that eschews hypothesis testing will lack the foundational information that makes up the body of knowledge needed to meet Congress’s best-available-science directive in the federal Endangered Species Act. Designing monitoring schemes and research projects in experimental frames that treat conservation (management) actions as hypotheses is fundamental to implementing adaptive resource management as the Delta Reform Act and the stakeholder Collaborative Science and Adaptive Management Program require.
The state resource agencies engaged in the Delta are not treating their prescribed management actions as on-the-ground manifestations of hypotheses about how the ecology of the Delta operates. If they were, each management action prescribed under the authority of California’s Endangered Species Act – the state’s Incidental Take Permit – would include specific performance criteria and be accompanied by an experimental design and a data-collection effort capable of differentiating between meeting management objectives and failing to do so. Rather than generating a new list of management questions and attendant Science Actions, the Science Action Agenda should address the management hypotheses posed as the management actions recently instituted by the ITP and previously under other regulatory authorities. A variety of management actions, both new and old, have been implemented at great cost to the California public, yet no serious effort has been made to evaluate whether the expected benefits to the Delta’s at-risk species are being realized.
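To make “capable of differentiating” concrete, here is one hedged illustration of the kind of design check that could accompany any ITP action before a single sample is collected. The effect size, variability, and site counts below are placeholders, not estimates for any real action; the exercise simply asks whether a proposed monitoring effort could plausibly detect the benefit the action promises.

```python
# Simulation-based power check for a hypothetical before/after
# monitoring design. All parameter values are illustrative only.
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)

def power(n_sites, effect=0.5, sd=1.0, alpha=0.05, n_sims=5000):
    """Fraction of simulated before/after comparisons in which a
    two-sample t-test detects the assumed restoration benefit."""
    hits = 0
    for _ in range(n_sims):
        before = rng.normal(0.0, sd, n_sites)    # pre-action condition
        after = rng.normal(effect, sd, n_sites)  # post-action condition
        _, p = stats.ttest_ind(after, before)
        hits += p < alpha
    return hits / n_sims

for n in (5, 10, 20, 40):
    print(f"{n:>3} sites per arm -> power ~ {power(n):.2f}")
```

If the simulated power is low at realistic sampling intensities, the monitoring plan cannot distinguish a successful action from a failed one, and the money spent on it buys data, not inference.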
A science program fueled by a collection of popular management questions is doomed to produce mostly educated guesses and non-scientific storytelling. The Delta Science Program should revisit the still-fresh Science Action Agenda and add a section that advances the requisite hypothesis-testing schemes that the resource management agencies, both state and federal, have long neglected. Reopen the document for changes? Why not? The document should be evergreen and dynamic, continuously growing with new information and reflecting emerging investigative tools and technologies. Even targeted management actions based on the best available scientific information will in time fail to meet expectations under changing environmental conditions and will need to be adjusted or retired and replaced. That’s the intent of adaptive resource management. That demands hypothesis testing. And that’s what the Delta Science Program needs to facilitate, and the Science Action Agenda needs to address.