It is currently impractical, and will remain so even if Moore's law of computational growth holds over the coming decades, to represent the global ocean at scales that resolve turbulence (i.e., centimeters). Increasing resolution to include ever smaller scales will yield more and more variability as turbulence is imperfectly simulated, but it is not clear whether this will converge on a more accurate solution. Issues of model resolution and the representation of ocean turbulence will therefore remain troublesome for at least the next several decades. These issues are particularly acute for biogeochemical models, which typically add from 4 to as many as 25-50 state variables to the numerical model, each of which must be advected, diffused, and operated on within the biological and chemical submodels. At present, model applications tend to divide into two groups: those that can be run on desktop computers or small distributed-memory clusters, which typically allow low to intermediate resolution and are oriented more toward process studies, and those run at supercomputing facilities, which often trade off parameter optimization and repeat runs for very high spatial resolution and are appropriate for long-timescale climate integrations and ocean prediction studies.
Two approaches have been developed to reduce the computational demand of three-dimensional coupled marine modeling studies. The first is to run submodels such as ecosystem or chemical models 'off-line', that is, to use physical model output stored at much longer discrete time intervals to step the submodel forward. One problem with this approach is that it removes feedback from the biological or chemical model to the physics, and it also risks increased diffusion associated with truncating the temporal variability of the modeled physical fields. A second approach that is gaining favor is to nest a higher-resolution submodel within a larger, coarser-resolution model domain. This approach has been applied very successfully to problems involving the interaction of the coastal ocean with the open ocean. Nesting can be '1-way', in which the models are not run concurrently and output from the larger-domain model simply drives the boundary conditions of the higher-resolution submodel, or '2-way', in which the models are run concurrently and results from the higher-resolution submodel are fed back to the larger domain, allowing, for example, the effects of resolving western boundary current processes to feed back into basin-scale simulations. Nesting is unnecessary with unstructured-mesh models, which can effectively run the fine inner mesh and the coarse outer mesh simultaneously. An example is the mesh used in the Hurricane Katrina simulation of flow around the levees near New Orleans, which included most of the North Atlantic so that the tidal forcing boundary lay far from the region of interest. Because the mesh was so fine in the flooded areas, the addition of the coarse-mesh areas increased the computational effort by only 10%.
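The 'off-line' approach can be sketched in a few lines. The example below is a minimal illustration, not any particular model's implementation: it advects a single biogeochemical tracer with a first-order 1-D upwind scheme, driving it with archived velocity snapshots that are linearly interpolated in time between coarse save intervals. All names, grid dimensions, and the 6-hour archive interval are illustrative assumptions.

```python
# Minimal sketch of 'off-line' coupling: a tracer submodel is stepped forward
# using physical-model velocities archived at coarse intervals, with linear
# interpolation in time between snapshots. Illustrative only.
import numpy as np

nx, dx, dt = 100, 1.0e3, 600.0          # grid cells, spacing (m), submodel step (s)
archive_dt = 6 * 3600.0                  # physics output saved every 6 h (coarse)

# Two archived velocity snapshots (m/s) bracketing the simulation window.
u_snapshots = [np.full(nx, 0.1), np.full(nx, 0.3)]

def u_at(t):
    """Linearly interpolate the archived velocity field to model time t."""
    w = min(max(t / archive_dt, 0.0), 1.0)
    return (1.0 - w) * u_snapshots[0] + w * u_snapshots[1]

def step_tracer(c, u, dt, dx):
    """First-order upwind advection of tracer c (assumes u >= 0, CFL < 1)."""
    c_new = c.copy()
    c_new[1:] -= u[1:] * dt / dx * (c[1:] - c[:-1])
    return c_new

c = np.zeros(nx)
c[10:20] = 1.0                           # initial tracer patch
t = 0.0
while t < archive_dt:
    c = step_tracer(c, u_at(t), dt, dx)  # physics interpolated, not re-run
    t += dt

print(c.sum())                           # interior tracer mass is retained
```

Note that the physics is never integrated here; the submodel sees only interpolated snapshots, which is precisely where the loss of feedback and the smoothing (extra diffusion) of temporal variability mentioned above originate.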