The Science Case

Decades of advances in high-performance computing (HPC) have brought us to a threshold. It is now possible to conceive of new classes of models[1], ones far better grounded in the laws of physics, providing less biased, more reliable, and hence more useful predictions and assessments of extremes. Operationally realizing the potential of these models, however, requires at least a 1000-fold increase in the computational capacity of our most performant prediction systems[2].
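Where a factor of 1000 comes from can be seen with a back-of-the-envelope scaling argument. The sketch below assumes compute cost grows with the number of horizontal grid columns times the number of time steps, with vertical resolution and per-point cost held fixed; this is an illustrative assumption, not a statement about any particular model.

```python
# Refining the horizontal grid of an explicit model by a factor r
# multiplies the number of grid columns by r**2, and the CFL condition
# shortens the stable time step by a further factor r, so total cost
# grows roughly as r**3.

def cost_factor(dx_old_km: float, dx_new_km: float) -> float:
    """Relative compute cost of running at dx_new instead of dx_old."""
    r = dx_old_km / dx_new_km      # horizontal refinement factor
    return r**2 * r                # r**2 more columns, r more time steps

# Going from ~10 km grid spacing to ~1 km:
print(cost_factor(10.0, 1.0))      # -> 1000.0, the 1000-fold figure
```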

[Figure 4]

For operational models of the global atmosphere and ocean, a factor of 1000 means that it becomes possible, for the first time, to replace crude statistical models of crucial climate processes (called 'parameterisations') with an explicit representation of these processes, one grounded in fundamental physics[3]. This would enable weather forecasts and climate projections by models that directly simulate the transient dynamics of crucial small (km) scale processes — ocean eddies, precipitating deep convection, gravity waves, ice fracture, the interaction of flows with local orography and bathymetry — whose detailed interaction with global circulation systems preconditions most weather and climate extremes[4]. Given the extent to which the current, inadequate representation of such processes limits our ability to usefully anticipate future changes in climate and weather, such a development will be a game changer[5].
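To make the notion of a 'parameterisation' concrete, the textbook schematic is sketched below; this is the generic Reynolds-type decomposition, not the budget equation of any specific model. Each field is split into a grid-mean and a subgrid fluctuation, and the grid-mean budget then contains unresolved eddy-flux terms that a coarse model must supply statistically.

```latex
% Decomposition of a field \phi into grid mean and subgrid fluctuation,
% and the resulting grid-mean budget (schematic, other terms elided):
\phi = \overline{\phi} + \phi', \qquad
\frac{\partial \overline{\phi}}{\partial t}
  + \overline{\mathbf{u}} \cdot \nabla \overline{\phi}
  = -\frac{1}{\overline{\rho}} \frac{\partial}{\partial z}
    \left( \overline{\rho}\, \overline{w'\phi'} \right) + \dots
% The subgrid flux \overline{w'\phi'} is what a parameterisation must
% estimate statistically on a ~100 km grid; at ~1 km grid spacing the
% dominant convective part of this transport is explicitly resolved.
```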

[Figure 5]

For models of the solid Earth, a 1000-fold increase in computational capacity would allow them to break free from a dependence on statistical and piecemeal modelling approaches[6][7]. Only on this scale of computation does it become possible to encapsulate advances in the understanding of rupture phenomena, magma flow, and wave propagation in heterogeneous media, and to integrate them with large-scale models of the solid Earth. Multi-scale and multi-physics models expressing these couplings will enable predictions of the spatial and temporal distribution of earthquakes, of their initiation and rupture, and of the resultant seismic shaking at the frequencies most relevant for the built environment. Applied to volcanic eruptions, this computational capacity will lead to the first systems capable of predicting the initiation of an eruption, the space-time evolution of the eruption dynamics, and its remote impacts.
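As a toy illustration of one ingredient named above, wave propagation through heterogeneous media, the sketch below solves the 1-D acoustic wave equation with a simple finite-difference scheme across a sharp velocity contrast. All grid and material values are illustrative assumptions; operational seismic codes are 3-D, far more sophisticated, and run at extreme scale.

```python
import numpy as np

# 1-D acoustic wave through a heterogeneous medium, second-order
# finite differences in space and time.
nx, dx, dt, nt = 1000, 10.0, 1.0e-3, 2000   # grid points, m, s, steps
c = np.full(nx, 2000.0)                     # background velocity (m/s)
c[nx // 2:] = 4000.0                        # a sharp material contrast

u_prev = np.zeros(nx)                       # wavefield at t - dt
u = np.zeros(nx)                            # wavefield at t
u[50] = 1.0                                 # impulsive point source

for _ in range(nt):
    lap = np.zeros(nx)
    lap[1:-1] = (u[2:] - 2 * u[1:-1] + u[:-2]) / dx**2
    u_next = 2 * u - u_prev + (c * dt)**2 * lap
    u_next[0] = u_next[-1] = 0.0            # fixed boundaries
    u_prev, u = u, u_next                   # advance one time step

# The pulse changes speed and partially reflects at the velocity
# contrast -- the basic effect that makes shaking estimates sensitive
# to small-scale Earth structure.
```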

By creating models that can efficiently, consistently, and physically link small-scale and local processes to global circulation, fault, or large-scale magma systems, new horizons in high-end data analysis and data assimilation[8] also open up. For instance, physical models integrating earthquake dynamics on complex fault geometries with real-time data assimilation from near-fault observatories will be constructed to identify possible precursors and to map the initiation and evolution of rupture. Similarly, physical models of the coupled dynamics in volcanic sub-domains will be integrated with automatic signal detection from massive analyses of data from multi-parametric volcano monitoring systems, to resolve the deep volcano dynamics and anticipate eruptions.
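The analysis step at the heart of data assimilation (see note [8]) can be illustrated with a minimal stochastic ensemble Kalman filter update for a single observed scalar. This is a hedged sketch of the general idea, not the scheme of any operational centre; all values are made up for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

n_ens = 50
ensemble = rng.normal(loc=10.0, scale=2.0, size=n_ens)  # model forecasts
obs, obs_err = 12.0, 1.0                                # observation, std

# The Kalman gain weighs forecast spread against observation error.
var_f = ensemble.var(ddof=1)
gain = var_f / (var_f + obs_err**2)

# Each member is nudged toward its own perturbed copy of the observation,
# which keeps the analysis ensemble statistically consistent.
perturbed_obs = obs + rng.normal(0.0, obs_err, size=n_ens)
analysis = ensemble + gain * (perturbed_obs - ensemble)

# The mean is pulled toward the observation, and the spread shrinks.
print(analysis.mean(), analysis.var(ddof=1))
```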

Across both the fluid and the solid Earth, the limitations in the physics of existing models render enormous amounts of observational information unusable. Better, more physically based models that explicitly represent the scales at which diverse, heterogeneous, and increasingly unconventional observing systems collect their data make it possible to assimilate this information. In addition, and not to be underestimated, the computational advances underpinning improvements in model quality will enable ensemble approaches[9]. This, in itself, opens up new possibilities in data assimilation, as well as in the quantification of uncertainties within a probabilistic framework.
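Why ensembles quantify uncertainty can be shown with a toy chaotic system: tiny perturbations of the initial state grow, and the spread of the resulting ensemble is the probabilistic uncertainty estimate referred to in note [9]. The Lorenz-63 system and all settings below are illustrative assumptions.

```python
import numpy as np

def lorenz_step(s, dt=0.005, sigma=10.0, rho=28.0, beta=8.0 / 3.0):
    """One forward-Euler step of the Lorenz-63 system."""
    x, y, z = s
    return s + dt * np.array([sigma * (y - x),
                              x * (rho - z) - y,
                              x * y - beta * z])

rng = np.random.default_rng(1)
base = np.array([1.0, 1.0, 1.0])
members = base + 1e-3 * rng.standard_normal((20, 3))  # perturbed analyses

for _ in range(2000):                                 # ~10 time units
    members = np.array([lorenz_step(m) for m in members])

# Ensemble mean and spread of x: the spread is the uncertainty estimate
# a single deterministic forecast cannot provide.
print(members[:, 0].mean(), members[:, 0].std())
```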


[1] Bony et al., 2015: Clouds, circulation and climate sensitivity. Nature Geoscience, 8, 261–268.

[2] Bauer et al., 2015: The quiet revolution of numerical weather prediction. Nature, 525, 47–55, doi:10.1038/nature14956.

[3] Stevens and Bony, 2013: What Are Climate Models Missing? Science, 340(6136), 1053–1054.

[4] Zhang et al., 2003: Effects of moist convection on mesoscale predictability. Journal of the Atmospheric Sciences, 60, 1173–1185.

[5] Shepherd, 2014: Atmospheric circulation as a source of uncertainty in climate change projections. Nature Geoscience, 7, 703–708.

[6] Giardini et al., 2014: Mapping Europe’s seismic hazard. EOS, 95(29), www.efehr.org.

[7] Selva et al., 2012: Operational eruption forecasting at high-risk volcanoes: the case of Campi Flegrei, Naples. Journal of Applied Volcanology, 1: 5. https://doi.org/10.1186/2191-5040-1-5.

[8] Data assimilation techniques combine models with observations through complex mathematical algorithms to produce a physically consistent estimate of the state of a physical system at a given time. This state estimate is used, among other purposes, for forecast initialisation.

[9] Ensembles of model predictions help characterise the prediction uncertainty arising from inaccuracies in the initial state and in the models themselves.