The Scientific Organization: Organizing U.S. Climate Modeling (2)
There are a few open themes in these blog posts that need attention – and a couple that I intend to fit together. In this entry I want to return to some of the issues raised in Something New in the Past Decade?, which looked at an old report on the organization of U.S. climate modeling and high performance computing. One motivation for returning to this old report is an ongoing panel study to write a new report about “A National Strategy for Advancing Climate Modeling.” (link)
Over the past 25 years there have been many reports written about climate and weather models (example), climate and weather observing systems (example), high performance computing (example), and how to transition efforts from research to operations (example). If you look into these reports and their conclusions, a number of common themes emerge. First, the very existence of these reports suggests a long-held perception that U.S. activities in climate science are not as effective as they need to be or could be. The reports consistently conclude by recognizing the creativity and quality of our scientific research, followed by calls for more integration across the federal agencies. In my earlier entry I argued that anytime there is a push toward more integration of research, there is both individual and institutional resistance.
This resistance occurs for many reasons, both good and bad, both structural and cultural. I want to focus on the reasons that appeal to the sanctity of “the science.” These arguments are often based on the notion that creativity and innovation cannot be managed. Further arguments rely on the observation that many innovations come from unexpected places and therefore cannot be anticipated. It follows, the argument goes, that the creative edge of science needs to be left unencumbered by the heavy hand of management required to assure integration.
Another notion enters the argument - that complying with the standards required to integrate component pieces into a whole hurts the integrity of “the science.” Two lines of reasoning support this. The first focuses on examples where attention was directed toward, say, information systems or technology, and a product of dubious scientific integrity was produced. The second is that by the time a particular component, say the algorithm that calculates the rain from thunderstorms, is integrated into an operational weather or climate model, that algorithm is no longer state of the art. Therefore operational or production models are always a step behind the best science.
These arguments, which have merit, serve to benefit the dominant type of scientific effort in the U.S.: the efforts associated with individual scientists, who focus (or reduce) their problems so as to isolate something specific and determine cause and effect. This reductionist approach to investigation is central to the classic scientific method, which has proven to be a very effective method of discovery. The focus on reduction, however, comes at the expense of the other path of science, the one that asks – how do all of the pieces fit together? That is the integrating, or unifying, path of science, and it requires a synthesis of knowledge. This synthesis does, in fact, lead to new discoveries, because when the pieces do not fit together we are required to ask – why not? The synthesis of scientific knowledge is also required to, for example, forecast the weather or climate, or to adapt to sea level rise.
My ultimate thesis is that a focus on integrated or unified science does not come at the expense of “the science,” and does not undermine the scientific method or the integrity of “the science.”
There are several elements of the scientific method. At the center of it all is testing and checking. In a good scientific paper, most of the text is spent describing the results and how those results were determined to be correct, or at least convincingly defended. A scrupulous reader looks for independence in the testing and validation; that is, how unbiased information is brought into the research to evaluate the results. The paper is then subjected to peer review, which is another form of verification. Once a paper is published, it becomes fair game for all to question, and there is, ultimately, a requirement that the result be verified by independent investigation. If the result cannot be reproduced, then there is little acceptance of the result as correct (see Wikipedia Cold Fusion).
This process of checking is ingrained in scientists, and those who develop a sustained legacy as quality researchers are invariably expert at checking results in multiple ways. It is also ingrained in the individual scientist to question the results of others. Therefore, at the level of the individual, there is a built-in process that does not promote synthesis, integration, or unification. Quite the contrary: what is promoted is the creation of many high-quality nuggets of knowledge. These nuggets may or may not fit together to form a consistent body of knowledge.
Returning to the beginning of this article, one message from report after report is the need for the integration of the efforts of climate science to meet the broader needs of the community. This is true for physical climate, where there is the need for integration of knowledge to provide predictive models for assessment of climate change. And, as those who decide to use the information from these models try to make decisions for their investments and their projects, there is a need for the integration of this information with many other sources of information – for example, how big does my drainage pipe need to be? How high should my levee be?
The reports call for better integration, but at the very foundation of the culture of research and the use of the scientific method, we value most the rugged individualism of skepticism. How, then, is integration of research to address societal goals achieved?
If it were easy, if it were simply a matter of making sure that all of the right pieces were built, then we would not have 25 years of reports with a cadence of “need more integration.” Perhaps the obvious answer is that there needs to be a process or an organization that, as a whole, honors the principles of the scientific method. This requires a process that builds trust among the individuals of the organization. It requires structuring checking and validation in a form that supports the transfer of knowledge (and computer code) from one individual to another. It requires the development of validation strategies that test the combined knowledge, the combined algorithms, in a quantitative and repeatable way. Such an organization is far different from one that is simply composed of many individual, excellent scientists. Next: thinking about the scientific organization that we need.
Figure 1: Chaos and order, 2008 Galvanized wire, 60x60x60cm. Barbara Licha, Finalist of Willoughby Sculpture Prize 2009. (from Ultimo Project Studios)