Examples

Flexible Modeling System

Flexible Modeling System (FMS) is a software framework developed by the Geophysical Fluid Dynamics Laboratory (GFDL), a NOAA climate modeling center in Princeton. It supports the efficient development, construction, execution, and scientific interpretation of atmospheric, oceanic, and climate system models. It is an outgrowth of the MOM family of ocean climate models, the latest of which is MOM4. The goal is to provide the international climate research community with a repository of robust and well-documented methods to simulate the ocean climate system. Researchers are invited to support the existing modules and to contribute modules that are absent from MOM4 but may enhance simulation integrity (e.g., a new physical parameterization or a new advection scheme) or extend the model's functionality. FMS comprises the following:

1. A software infrastructure for constructing and running models. This infrastructure includes software to handle parallelization, input and output, data exchange between various model grids, orchestration of the time stepping, makefiles, and simple sample run scripts. This infrastructure should largely insulate FMS users from machine-specific details.

2. A standardization of the interfaces between various component models (a sketch of such an interface follows this list).

3. Software for standardizing, coordinating, and improving diagnostic calculations of FMS-based models, and input data preparation for such models. Common preprocessing and postprocessing software are included to the extent that the needed functionality cannot be adequately provided by available third-party software.

4. Contributed component models that are subjected to a rigorous software quality review and improvement process.

5. A standardized technique for version control and dissemination of the software and documentation.
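To make item 2 more concrete, the following is a minimal illustrative sketch, in C++, of what a standardized component-model interface and a coupler that orchestrates the time stepping (item 1) might look like. FMS itself is written in Fortran 90, and every class and function name below is invented for illustration rather than taken from the FMS API.

// Hypothetical sketch only: FMS itself is Fortran 90; all names here are invented.
#include <iostream>
#include <memory>
#include <vector>

// A standardized interface that every component model (atmosphere, ocean, ...)
// would implement, so the coupler can drive any of them in the same way.
class ComponentModel {
public:
    virtual ~ComponentModel() = default;
    virtual void init() = 0;                  // read configuration, allocate state
    virtual void step(double dt_seconds) = 0; // advance one coupling interval
    virtual void finalize() = 0;              // write restarts, free resources
};

class ToyAtmosphere : public ComponentModel {
public:
    void init() override { std::cout << "atmos: init\n"; }
    void step(double dt) override { std::cout << "atmos: step " << dt << " s\n"; }
    void finalize() override { std::cout << "atmos: finalize\n"; }
};

class ToyOcean : public ComponentModel {
public:
    void init() override { std::cout << "ocean: init\n"; }
    void step(double dt) override { std::cout << "ocean: step " << dt << " s\n"; }
    void finalize() override { std::cout << "ocean: finalize\n"; }
};

int main() {
    // The "infrastructure" role: orchestrate the time stepping over all components.
    std::vector<std::unique_ptr<ComponentModel>> components;
    components.push_back(std::make_unique<ToyAtmosphere>());
    components.push_back(std::make_unique<ToyOcean>());

    for (auto& c : components) c->init();
    const double dt = 3600.0;                 // one-hour coupling step
    for (int step = 0; step < 3; ++step)
        for (auto& c : components) c->step(dt);
    for (auto& c : components) c->finalize();
}

Because every component implements the same init/step/finalize contract, the coupler can drive an atmosphere, an ocean, or any contributed component without knowing its internals, which is the essence of interface standardization.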

FMS is a software framework. The FMS developers make it clear that their system does not include the determination of model configurations, parameter settings, or the choice of modules. The development of new component models is a scientific concern that is outside the direct purview of FMS. Nonetheless, infrastructural changes to enable such developments are within the scope of FMS. The collaborative software review process for contributed models is therefore an essential facet of FMS. The quality review and improvement process includes consideration of (1) compliance with FMS interface and documentation standards to ensure portability and interoperability, (2) understandability (clarity and consistency of documentation, comments, interfaces, and code), and (3) general computational efficiency without algorithmic changes.

As a software framework, FMS has certain clear requirements that contributed code must meet:

1. Clean modular Fortran 90 code that minimally touches other parts of the model.

2. Satisfaction of the FMS code specifications outlined in the FMS Developers' Manual.

3. Compatibility with the MOM4 test cases.

4. Thorough and pedagogical documentation of the module.

5. Comments within the code that emulate those in other parts of the model.

Modular Modeling System

Modular Modeling System (MMS) was developed by G. Leavesley and his colleagues at the US Geological Survey (USGS). It is described as a framework for modeling that can be used to develop, support, and apply any dynamic model, but it is specifically focused on the environmental and natural-resource-management disciplines. MMS uses a module library that contains modules for simulating a variety of physical processes, primarily water, energy, chemical, and biological processes. A model is created by selectively coupling appropriate modules from the library to suit a desired application. When existing modules do not provide appropriate process algorithms, new modules can be developed.

MMS is an integrated system of computer software developed to (1) provide the research and operational framework needed to enhance development, testing, and evaluation of physical-process algorithms; (2) facilitate integration of user-selected algorithms into operational physical-process models; and (3) provide a common framework in which to apply historic or new models and analyze their results.

This framework facilitates multidisciplinary research and operational efforts. Researchers in a variety of disciplines can develop and test model components in their own areas of expertise and combine these modules with those of the other researchers to develop a complete system model. In addition, as research provides improved model components, these can be used to modify or enhance existing operational models by inserting or replacing process modules.

The conceptual framework for MMS has three major components: preprocess, model, and postprocess. A system supervisor, in the form of a window-based Graphical User Interface (GUI), provides user access to all the components and features of MMS. There are versions that work under UNIX and Windows operating systems. The GUI provides an interactive environment for users to access model-component features, apply selected options, and graphically display simulation and analysis results.

The 'preprocess component' includes the tools used to input, analyze, and prepare spatial and time-series data for use in model applications. A goal in the development of the preprocess component is to take advantage of the wide variety of existing data-preparation and analysis tools and to provide the ability to add new tools as they become available. The time series and other data needed to run the model have to be prepared as a single flat ASCII file. Procedures are being developed to interface models with a variety of commercial and user-defined databases, such as SQL-type databases (Oracle and Ingres) and the HEC-DSS database. NetCDF is another supported data format. These are being used in real-time applications with the Bureau of Reclamation and the Natural Resources Conservation Service.
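As a purely illustrative sketch in C++, the fragment below reads a generic whitespace-delimited time-series record of the kind described; the actual MMS data-file layout (headers, station counts, and so on) is not reproduced, and the variable names are invented.

// Hypothetical sketch: a generic flat, whitespace-delimited time-series file,
// one record per time step. Not the actual MMS data-file format.
#include <iostream>
#include <sstream>
#include <string>

int main() {
    // Each line: date followed by driving variables (e.g. precip, tmax, tmin).
    std::istringstream file(
        "1995-10-01  0.0  18.2  6.1\n"
        "1995-10-02  4.3  15.7  5.4\n"
        "1995-10-03  1.1  14.9  4.8\n");

    std::string line;
    while (std::getline(file, line)) {
        std::istringstream record(line);
        std::string date;
        double precip, tmax, tmin;
        record >> date >> precip >> tmax >> tmin;
        std::cout << date << ": precip=" << precip
                  << " tmax=" << tmax << " tmin=" << tmin << "\n";
    }
}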

The 'model component' is the core of the system and includes the tools to selectively link process modules from the module library to build a model and to interact with this model to perform a variety of simulation and analysis tasks. The module library contains a variety of compatible modules for simulating water, energy, and biogeochemical processes. Several modules for a given process may be present, each representing an alternative conceptualization or approach to simulating that process. A module can be written in either the FORTRAN or C programming language.

Modules are located both in read-only directories, where tested, documented, and approved code resides, and in user-defined work directories, where new modules are being developed. The user selects and links modules from these directories to create a specific model using an interactive, graphical model-builder tool (XMBUILD). Modules are linked by coupling the outputs of user-selected modules to the required inputs of other user-selected modules. Tools are provided to display a module's input requirements and to list all available modules that will satisfy each of these inputs. When the inputs of all modules are satisfied, the model is complete. Once a model has been built, it may be saved for future use without repeating the XMBUILD step.
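The following sketch illustrates, in hypothetical C++, the completeness check implied by this linking scheme: a model is complete when every required input of every selected module is satisfied either by the driving data or by another module's output. The module and variable names are invented, and XMBUILD itself (a graphical tool) is not reproduced here.

// Hypothetical sketch of the input/output coupling check; names are invented.
#include <iostream>
#include <set>
#include <string>
#include <vector>

// A module declares the variables it requires and the variables it produces.
struct ModuleSpec {
    std::string name;
    std::vector<std::string> inputs;
    std::vector<std::string> outputs;
};

// A model is "complete" when every input of every selected module is supplied
// either by the driving data (e.g. the time-series file) or by some module's output.
bool modelComplete(const std::vector<ModuleSpec>& selected,
                   const std::set<std::string>& drivingData) {
    std::set<std::string> available = drivingData;
    for (const auto& m : selected)
        available.insert(m.outputs.begin(), m.outputs.end());

    bool ok = true;
    for (const auto& m : selected)
        for (const auto& in : m.inputs)
            if (!available.count(in)) {
                std::cout << "unsatisfied input '" << in << "' in module " << m.name << "\n";
                ok = false;
            }
    return ok;
}

int main() {
    std::vector<ModuleSpec> selected = {
        {"solar_radiation", {"air_temperature"}, {"net_radiation"}},
        {"snowmelt",        {"net_radiation", "precipitation"}, {"melt"}},
        {"runoff",          {"melt", "soil_moisture"}, {"streamflow"}},
    };
    std::set<std::string> driving = {"air_temperature", "precipitation"};

    std::cout << (modelComplete(selected, driving) ? "model complete\n"
                                                   : "model incomplete\n");
}

Running the sketch reports soil_moisture as an unsatisfied input, which is exactly the kind of gap a user would resolve by selecting an additional module from the library.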

When a model is executed, the user interacts with the model through a series of pull-down menus in the GUI, which provide links to a variety of system features. These include the ability to (1) select and edit parameter files and data files; (2) select a number of model execution options, such as a basic run, an optimization run, or a sensitivity-analysis run; and (3) select a variety of statistical and graphical analyses of the simulation output. During a basic run, up to four graphical display windows can be opened to display any of the variables that have been declared in the model modules. As many as ten variables can be displayed in each window, and plotted results can be output in HPGL or PostScript format, either to a digital file or to a printer.

The 'postprocess component' provides a number of tools to display and analyze model results and to pass results to management models or other types of software. Model output can also be directed to user-specific analysis programs using an ASCII flat-file format. Some postprocessing capabilities interact directly with the model component: parameter-optimization and sensitivity-analysis tools are provided to optimize selected model parameters and to evaluate the extent to which uncertainty in model parameters affects uncertainty in simulation results. A geographic information system (GIS) interface is being developed to provide tools for the analysis and manipulation of spatial data in the preprocess, model, and postprocess components of MMS. Pre- and postprocessing interfaces are being developed as generic interfaces to support a variety of applications, such as the Arc/Info GIS package. Another candidate support package is the Geospatial Library for Environmental Modeling (GEOLEM).
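As an illustration of the sensitivity-analysis idea, the sketch below performs a simple one-at-a-time parameter perturbation around a toy model. It does not reproduce MMS's actual optimization or sensitivity tools; the parameter names and the stand-in runModel function are invented.

// Hypothetical sketch of one-at-a-time parameter sensitivity analysis.
#include <cmath>
#include <iostream>
#include <map>
#include <string>

// Stand-in for a model run: returns a single summary output (e.g. total runoff)
// as a function of two parameters. A real model would be far more involved.
double runModel(const std::map<std::string, double>& p) {
    return p.at("melt_factor") * 120.0 + std::sqrt(p.at("soil_capacity")) * 8.0;
}

int main() {
    std::map<std::string, double> params = {{"melt_factor", 2.5}, {"soil_capacity", 150.0}};
    const double base = runModel(params);
    const double perturbation = 0.10;  // perturb each parameter by +10%

    for (const auto& [name, value] : params) {
        auto perturbed = params;
        perturbed[name] = value * (1.0 + perturbation);
        const double out = runModel(perturbed);
        // Relative sensitivity: % change in output per % change in the parameter.
        const double sensitivity = ((out - base) / base) / perturbation;
        std::cout << name << ": relative sensitivity = " << sensitivity << "\n";
    }
}

Each parameter is perturbed by 10% and the relative change in a single summary output is reported, giving a crude ranking of the parameters to which the results are most sensitive.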

While the GUI is targeted at a broad range of model users, any updates of the system and additions of new modules or interfaces require good programming skills.

Library of Hydro-Ecological Modules

The Library of Hydro-Ecological Modules (LHEM) at the Institute for Ecological Economics, University of Vermont (UVM), offers a somewhat different approach, in which modules are developed and archived at the level of an icon-based system such as STELLA. This offers a great deal of transparency and extensibility, especially if the modules are properly documented. To a certain extent the system can be used and expanded by users with no or only very basic programming experience. STELLA provides a user-friendly GUI that allows module construction without any computer programming. However, STELLA itself has very limited support for modularity: there are no formal mechanisms to put individual STELLA models together and integrate them. STELLA does allow submodels, or sectors, within the context of a larger model, and each sector can be run independently of the others or in any combination. However, there is no easy way for a sector to be replaced or moved from one model into another. To bypass these restrictions of STELLA, the Spatial Modeling Environment (SME) is used. SME can take individual STELLA models and translate them into a format that supports modularity. In addition to STELLA modules, SME can also incorporate user-coded modules for components that are more complex and would be impossible to handle in STELLA, for example, various spatial fluxes in a watershed or a landscape.

Using SME, a general modular framework was developed that defines the set of basic variables and the connections between modules. Particular implementations of modules are flexible and assume a wide variety of components that are to be made available through libraries of modules. The modules are formulated as stand-alone STELLA models that can be developed, tested, and used independently. However, they can share certain variables that are the same in different modules, using a convention that is defined and supported in the library specification table. When modules are developed and run independently, these shared variables are specified by user-defined constants, graphical functions, or time series. Within the SME context these variables are updated by other modules to create a truly dynamic interaction.
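The shared-variable mechanism can be sketched as follows, again in hypothetical C++ rather than in SME's real classes: a variable declared in the library specification table falls back to a user-supplied series when a module runs stand-alone, but is overwritten by a producing module when the modules are coupled.

// Hypothetical sketch: names are invented; SME's real C++ classes are not shown.
#include <iostream>
#include <vector>

// A shared variable as defined in the library specification table: when a module
// runs stand-alone it is fed by a user-supplied time series; when modules are
// coupled, another module overwrites it each time step.
class SharedVariable {
public:
    explicit SharedVariable(std::vector<double> standalone_series)
        : series_(std::move(standalone_series)) {}

    // Called by a producing module when running inside the coupled framework.
    void update(double v) { coupled_value_ = v; coupled_ = true; }

    // Called by a consuming module; falls back to the time series when uncoupled.
    double value(std::size_t step) const {
        return coupled_ ? coupled_value_ : series_[step % series_.size()];
    }

private:
    std::vector<double> series_;
    double coupled_value_ = 0.0;
    bool coupled_ = false;
};

int main() {
    SharedVariable water_table({0.8, 0.9, 1.1});   // stand-alone driving series (m)

    // Stand-alone mode: the consuming module just reads the prescribed series.
    std::cout << "standalone step 1: " << water_table.value(1) << "\n";

    // Coupled mode: a hydrology module updates the variable, and the consumer
    // sees the dynamically computed value instead of the prescribed series.
    water_table.update(1.37);
    std::cout << "coupled step 1:    " << water_table.value(1) << "\n";
}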

For spatial dynamics or other sophisticated system features, modules are formulated in C++. They use some of the SME classes to access the spatial data, are incorporated into the SME driver, and are used to update the local variables described within the STELLA modules. In this case the level of transparency is certainly lower than with the STELLA modules. LHEM offers a framework to archive modules that may be used either as stand-alone models to describe certain processes and ecosystem components or may be put together into more elaborate structures by using the SME.

When applying the LHEM, or any other modeling library, the major complication for the user is to put the modules together in a meaningful and consistent way. In a prefabricated model, the issues of scale consistency are taken care of by the model developers beforehand. With the modular approach, the challenge of combining modules in such a way that they match the complexity of the modeled system and are mutually consistent becomes the task of the library user. Once again, this added concern is the price paid for the added flexibility and optimality of the resulting models. In theory, we can envision modeling systems that would keep track of the scales and resolutions of the various processes involved and automatically allow links only between modules that match these scales. In practice, with all the complexity and uncertainty associated with ecological and socioeconomic systems, it may still be a while until such modeling tools appear. In the meantime, model transparency will be a very important prerequisite of modularity, especially if the modules are to be used in a research context.
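The scale-tracking idea can be sketched in a few lines: if each module declared its temporal and spatial resolution, a linker could refuse (or flag) connections between modules whose scales do not match. The sketch below is purely illustrative and does not correspond to any existing system's API; all names are invented.

// Hypothetical sketch of scale-aware module linking.
#include <cmath>
#include <iostream>
#include <string>

struct ModuleScale {
    std::string name;
    double time_step_s;   // temporal resolution, seconds
    double cell_size_m;   // spatial resolution, metres
};

// Allow a link only when the two modules' scales agree to within a tolerance.
bool compatible(const ModuleScale& a, const ModuleScale& b, double tol = 1e-6) {
    return std::fabs(a.time_step_s - b.time_step_s) < tol &&
           std::fabs(a.cell_size_m - b.cell_size_m) < tol;
}

int main() {
    ModuleScale hydrology  {"hydrology",  86400.0, 200.0};  // daily, 200 m cells
    ModuleScale vegetation {"vegetation", 86400.0, 200.0};
    ModuleScale plume      {"plume",       3600.0,  30.0};  // hourly, 30 m cells

    std::cout << "hydrology <-> vegetation: "
              << (compatible(hydrology, vegetation) ? "link allowed" : "scale mismatch") << "\n";
    std::cout << "hydrology <-> plume:      "
              << (compatible(hydrology, plume) ? "link allowed" : "scale mismatch") << "\n";
}

A production version would presumably offer aggregation or disaggregation between mismatched scales rather than simply refusing the link, but even this simple check conveys how scale metadata could make module composition safer.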
