DIMACS Workshop on Geological data fusion: Tackling the statistical challenges of interpreting past environmental change

January 17 - 18, 2013
DIMACS Center, CoRE Building, Rutgers University

Robert Kopp, Rutgers University, Robert.Kopp at rutgers.edu
Frederik Simons, Princeton University, fjsimons at princeton.edu
Presented under the auspices of the DIMACS Special Focus on Information Sharing and Dynamic Data Analysis.


Kevin Anchukaitis, WHOI

Title: Climate information from tree-rings: state-of-the-art and existing challenges

Tree rings remain one of the primary proxy archives for developing high-resolution climate reconstructions of the last millennium. These records are neither thermometers nor rain gauges, however, but rather the result of multivariate and potentially nonlinear biological and physical processes that occur across a range of timescales. Here, I review the fundamental biophysical and methodological features of tree-ring proxies and discuss current challenges in dendroclimatology. These include forward modeling, detrending and standardization, the divergence problem, climate field and hemispheric temperature reconstructions, and the strengths and limitations of the existing proxy network in space and time.

Patrick Applegate, Penn State

Title: First steps toward using geological data to reduce uncertainty in future sea level rise contributions from the Greenland Ice Sheet

Ice sheet model-based projections of sea level rise from the Greenland Ice Sheet are highly uncertain, suggesting a need for probabilistic calibration of such models using geologic data. Ice sheet models represent changes in ice distribution, velocity, and volume over time, due to ice flow, snowfall, and surface melt (among other processes). International projects that use these models to project future sea level change tune each model against the observed modern state of the ice sheet, then perform one projection per model and climate scenario. However, the models include many uncertain parameters, and tuning solely against the modern ice sheet state raises questions about whether the models respond appropriately to climate forcings. Our recent work (Applegate et al., 2012, The Cryosphere) estimates the contribution of model parametric uncertainty to the spread of possible future ice volume changes. In that study, we obtained a range of potential sea level rise contributions from the Greenland Ice Sheet that is 40-70% of the central estimate. In this talk, we present data sets that could be used to tune ice sheet models in a probabilistic way, and we show preliminary results from our efforts to perform this tuning. Further development of this approach will yield projections of Greenland ice volume change with uncertainties that are reduced by tuning the models to appropriate data on the ice sheet's past behavior.

Michael Dietze, Boston U.

Title: Assimilating paleoecological data into land surface & biogeochemical models

Terrestrial land surface and biogeochemical models are used to forecast ecological responses to global-change drivers, yet their predictions are highly divergent and have large uncertainties. Furthermore, testing and calibration of these models is primarily based on sub-daily to decadal data, which fail to capture long-term trends and infrequent extreme events. The capacity of these models for scientific inference and long-term prediction would be greatly improved if uncertainties could be reduced through rigorous testing against long-term observational data.

PalEON is an interdisciplinary team of paleoecologists, statisticians, and modelers partnered to rigorously synthesize paleoecological data and ecosystem models. Our aim is to gain a deeper understanding of past dynamics and to use this knowledge to improve long-term forecasting capabilities. Our data-model integration focuses on four objectives and associated research questions: 1) Validation: How well do ecosystem models simulate decadal-to-centennial dynamics when confronted with past climate change, and what limits model accuracy? 2) Initialization: How sensitive are ecosystem models to initialization state and equilibrium assumptions? Do data-constrained simulations of centennial-scale dynamics improve 20th-century simulations? 3) Inference: Was the terrestrial biosphere a carbon sink or source during the Little Ice Age and Medieval Climate Anomaly? and 4) Improvement: How can parameters and processes responsible for data-model divergences be improved?

Vivien Gornitz, Columbia University

Title: A Rapid Ice-Melt Sea Level Rise Scenario Based on the Last Glacial Termination as an Analog

Recent trends in global sea level rise and increasing contributions from ice melt suggest that the IPCC (2007) future projections already appear too conservative. Global climate models may be underestimating sea level rise, in large part because they do not adequately represent all of the physical processes involved in ice melting. Growing concern over increasing ice-melt trends indicates that alternative approaches are needed for future sea level rise projections. One such approach examines periods of rapid paleo-sea level change as an analog. In work for the New York City Panel on Climate Change (2010) and the New York State ClimAID study (2011), the Last Glacial Termination was selected as an analog. Following the last ice age, sea level rose 120 m, starting gradually around 20,000 years ago and picking up speed between around 16,000 and 7,000-8,000 years ago. The average rate of rise over this 10,000-12,000 year period was around 10-12 mm/yr (0.39-0.47 in/yr). Although this rise was punctuated by several shorter, even more rapid spurts known as "meltwater pulses", some of which lasted several centuries, these outbursts are unlikely to be matched during the 21st century, since they occurred after the ice sheets had already been weakened by many centuries of prior forcing and the total ice extent was much greater than today. The "Rapid Ice-Melt" scenario assumes that glaciers and ice sheets will melt at an average rate comparable to that of the last deglaciation (i.e., 10-12 mm/yr [0.39-0.47 in/yr]), raising sea level by up to 1 m by 2100. An exponential curve is fitted to average ice-melt rates during a start period (2000-2004) and reaches 1 m by 2100. The other contributions to sea level rise (i.e., global thermal expansion, local ocean dynamic height, land subsidence) are then added from GCM and GIA models for three time slices (2020s, 2050s, 2080s).
This scenario yields a total sea level rise of ~0.94-1.50 m (37-59 in) by the 2080s for New York City, comparable to recent estimates from other sources. Potential revisions to the rapid ice-melt scenario will be discussed, based on new information regarding the deglaciation and the latest trends in ice melt.

Julia Hargreaves, JAMSTEC

Title: Can the Last Glacial Maximum constrain climate sensitivity?

Paleoclimate simulations provide us with an opportunity to critically confront and evaluate the performance of climate models in simulating the response of the climate system to changes in radiative forcing and other boundary conditions. We use recent data syntheses to analyse the skill and reliability of the multi-model ensembles from the second Paleoclimate Modelling Intercomparison Project (PMIP2) for the Last Glacial Maximum and Mid-Holocene. Obtaining promising results for the Last Glacial Maximum, we further use the syntheses of proxy data on both land and ocean (MARGO Project Members, 2009; Bartlein et al., 2011; Shakun et al., 2012), combined with the PMIP2 ensemble, to generate a spatially complete reconstruction of surface air (and sea surface) temperatures, obtaining an estimated global mean cooling of 4.0 ± 0.8 °C (95% CI). Finally, we investigate the relationship between the Last Glacial Maximum (LGM) and climate sensitivity across the PMIP2 multi-model ensemble of GCMs, and find a correlation between tropical temperature and climate sensitivity which is statistically significant and physically plausible. We use this relationship, together with the LGM temperature reconstruction, to generate estimates for the equilibrium climate sensitivity. We estimate the equilibrium climate sensitivity to be about 2.5 °C, with a high probability of being under 4 °C, though these results are subject to several important caveats. We propose that the forthcoming PMIP3 ensemble of models will provide a useful validation of the correlation presented here.
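The emergent-constraint logic in this abstract (regress sensitivity on LGM tropical cooling across an ensemble, then apply a proxy-based cooling estimate) can be sketched as follows. All numbers are invented for illustration and are not the PMIP2 or reconstruction values:

```python
import numpy as np

# Hypothetical ensemble: per-model LGM tropical cooling (K) and equilibrium
# climate sensitivity (K per CO2 doubling). These numbers are invented.
trop_cooling = np.array([1.5, 2.0, 2.4, 2.8, 3.3, 3.8, 4.2])
sensitivity = np.array([1.8, 2.1, 2.4, 2.6, 3.1, 3.4, 3.9])

# Regress sensitivity on cooling across the ensemble, then read off the
# sensitivity implied by a (hypothetical) proxy-based cooling estimate.
slope, intercept = np.polyfit(trop_cooling, sensitivity, 1)
proxy_cooling = 2.5                      # assumed reconstruction, K
estimate = slope * proxy_cooling + intercept
print(round(estimate, 2))
```

In practice the regression residuals and the uncertainty of the proxy estimate would both be propagated into the final sensitivity distribution, which is what makes the caveats in the abstract matter.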

Linda Hinnov, Johns Hopkins

Title: Complex signal analysis of paleoclimatic time series

The recovery of ever-longer paleoclimatic time series has solved many analysis and modeling problems in paleoclimatology, but has also led to new problems. Longer time series provide much-needed, improved frequency resolution, but paleoclimatic change and its recorder(s) tend to drift in frequency over long, multi-million year timescales. If not addressed, such drifting can introduce significant errors to measured spectra and other analytical procedures. Here quadrature methods are used to obtain the complex signal of a paleoclimatic time series, which in turn is used to estimate "instantaneous" amplitude, frequency and phase attributes of the signal. These attributes are evaluated as tools for constraining timescale error, for extracting planetary resonance modulators from astronomically forced data, and for detecting phase aberration in astronomical frequencies due to climate friction.
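The quadrature (analytic-signal) approach can be illustrated in a few lines of Python; the chirp below is a synthetic stand-in for a frequency-drifting paleoclimatic cycle, not real data:

```python
import numpy as np
from scipy.signal import hilbert

# Synthetic stand-in: a cosine whose frequency drifts linearly, mimicking a
# paleoclimatic cycle recorded at a slowly changing accumulation rate.
fs = 100.0                          # samples per kyr (illustrative units)
t = np.arange(0, 50, 1 / fs)        # 50 kyr of record
f0, drift = 2.0, 0.01               # cycles/kyr; both values are made up
signal = np.cos(2 * np.pi * (f0 * t + 0.5 * drift * t**2))

# The complex (analytic) signal via the Hilbert transform yields the
# "instantaneous" amplitude, phase, and frequency attributes.
analytic = hilbert(signal)
amplitude = np.abs(analytic)
phase = np.unwrap(np.angle(analytic))
inst_freq = np.diff(phase) / (2 * np.pi) * fs    # cycles per kyr

# Mid-record, the recovered frequency should track f0 + drift * t.
mid = len(inst_freq) // 2
print(inst_freq[mid])               # close to 2.0 + 0.01 * 25 = 2.25
```

Tracking how `inst_freq` deviates from an assumed astronomical frequency is one way to expose the timescale drift the abstract describes.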

Ben Horton, U. Penn

Title: Sea-level change along the Atlantic coast of the United States

The rate of sea-level rise along the US Atlantic and Gulf coasts increased through the 20th century and will almost certainly continue to accelerate during the 21st century and beyond, although significant uncertainty surrounds the likely magnitude and regional variability. Key uncertainties include the role of the Greenland and West Antarctic ice sheets, mountain glaciers and ocean density (steric) changes. Insufficient understanding of these physical processes has precluded accurate prediction of sea-level rise. New approaches using semi-empirical models that relate instrumental records of climate and sea-level rise have projected up to 2 m of sea-level rise by AD 2100. But the duration of instrumental records is insufficient to adequately constrain the climate-sea-level relationship.

We have produced new high-resolution reconstructions of sea-level change for the last 2000 years along the US Atlantic Coast, spanning the alternation between the so-called "Medieval Climate Anomaly", "Little Ice Age" and 20th century warming. Innovative microfossil-based transfer functions from salt-marsh sediments are used to generate sea-level records with a vertical resolution of ± 0.1-0.3 m. Combining this approach with a suite of complementary dating methods provides the ability to precisely constrain the chronology (decadal to centennial age resolution) of subtle changes in sea level. We have used the proxy sea-level data and global temperature reconstructions to provide crucial additional constraints on the parameters in semi-empirical models of sea-level rise.

Before the models can provide appropriate data for coastal management and planning, they must be complemented with regional estimates of sea-level rise. The proxy sea-level data collected from five study areas (Massachusetts, Connecticut, New Jersey, North Carolina and Florida) expose regional variability due to glacial isostatic adjustment (GIA) of the solid Earth. In New Jersey, Massachusetts and Connecticut, GIA-corrected sea level was stable from at least 200 BC until AD 500. Sea level then rose at a rate of less than 1 mm/yr, associated with the Medieval Climate Anomaly. In North Carolina the rise in sea level began slightly later, at AD 950. All records show stable or falling sea level between AD 1400 and the late 19th century, at the time of the Little Ice Age. Since then, sea level has risen at greater than 2 mm/yr, representing the steepest century-scale increase of the past two millennia.

Bob Kopp, Rutgers University

Title: Bayesian inference on sea level and ice volume history during past interglacials

Sea level rise -- driven in part directly by changes in ocean temperature and in part by melting land ice -- figures prominently among the effects of a warming climate. Melt dynamics are, however, complicated and challenging to project using forward models. The geological record of past sea level changes provides a complementary source of information about ice sheet stability. Past warm periods, such as the Last Interglacial stage, have the potential to provide insights into steady-state ice sheet behavior under modestly warmer conditions. Yet this record is composed of proxies that are uncertain in their meaning, uncertain in their ages, and that reflect sea level as seen through the filter of a range of physical processes that cause local sea level change to deviate, and sometimes even differ in sign, from changes in mean global sea level. To infer past sea level and ice sheet changes from geological observations, we adopt a Bayesian framework based on Gaussian process techniques developed in the machine learning community. A prior probability distribution for sea level and ice volume can be constructed by combining a constraint on total ice volume from the marine oxygen isotope record with physical modeling of the sea level response to ice volume changes. Either sampling or approximation approaches allow the incorporation of geochronological uncertainty. An initial implementation of this framework applied to the Last Interglacial stage (~125 thousand years ago) revealed global mean sea level well in excess of modern: a 95% probability that sea level exceeded 6.6 m above present, a 67% probability that it exceeded 8.0 m, and a 33% probability that it exceeded 9.4 m.
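A minimal sketch of the Gaussian process machinery (conditioning a smooth prior on noisy observations) looks like this. The kernel, hyperparameters, and "proxy" data are all invented for illustration; the actual framework additionally handles age errors and spatially variable physical responses:

```python
import numpy as np

# Squared-exponential covariance in time (kyr), prior sd in metres.
# Both hyperparameters are illustrative assumptions.
def sq_exp(t1, t2, sigma=10.0, length=5.0):
    d = t1[:, None] - t2[None, :]
    return sigma**2 * np.exp(-0.5 * (d / length) ** 2)

# Hypothetical proxy observations: ages (kyr before present), relative
# sea level (m), and one-sigma measurement errors (m).
t_obs = np.array([130.0, 127.0, 125.0, 122.0, 119.0])
y_obs = np.array([-10.0, 2.0, 7.0, 5.0, -15.0])
err = np.array([3.0, 2.0, 1.5, 2.0, 4.0])

# Standard GP conditioning: posterior mean and variance on a dense grid.
t_pred = np.linspace(132, 117, 61)
K = sq_exp(t_obs, t_obs) + np.diag(err**2)
K_star = sq_exp(t_pred, t_obs)
post_mean = K_star @ np.linalg.solve(K, y_obs)
post_var = np.diag(sq_exp(t_pred, t_pred) - K_star @ np.linalg.solve(K, K_star.T))

# The posterior mean is pulled toward the best-constrained observation.
i125 = np.argmin(np.abs(t_pred - 125.0))
print(post_mean[i125], post_var[i125])
```

Geochronological uncertainty can then be layered on by sampling perturbed ages for `t_obs` and averaging the resulting posteriors, the "sampling approach" mentioned in the abstract.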

Stephen Meyers, U. Wisconsin

Title: Patterns in Static

The attribution of paleoclimate change to particular mechanistic drivers, such as decadal- to millennial-scale solar variability or longer-term quasi-periodic Milankovitch orbital forcing, often relies upon spectral analysis of climate proxy data. To guard against false attribution of fluctuations that are due to stochastic internal variability, climate proxy data sets are typically evaluated against one or more null 'noise' hypotheses, such as a first-order autoregressive model. A fundamental challenge in such hypothesis testing is the accurate estimation of noise model parameters in the presence of deterministic quasi-periodic signals, which can bias parameter estimates and reduce the statistical power of the tests. Stated differently, strong deterministic signals contaminate the noise estimate, which may also be of interest in its own right for constraining geophysical mechanisms of internal paleoclimate variability. Here I will review the problem, demonstrate important limitations of commonly used parametric approaches, and present a new alternative methodology, which highlights the utility of robust spectral background estimation techniques and appropriate 'pre-whitening' procedures.
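The red-noise null test, and the contamination problem it suffers from, can be seen in a toy example: the lag-1 autocorrelation is estimated from a series that already contains a deterministic cycle, and the periodogram is then compared against the resulting AR(1) background. All parameters are illustrative:

```python
import numpy as np

rng = np.random.default_rng(0)
n, rho = 1000, 0.7

# Synthetic proxy series: AR(1) "climate noise" plus a weak 40-step cycle.
noise = np.zeros(n)
for i in range(1, n):
    noise[i] = rho * noise[i - 1] + rng.standard_normal()
t = np.arange(n)
series = noise + 0.8 * np.sin(2 * np.pi * t / 40.0)

# Lag-1 autocorrelation estimate. Note it is biased by the cycle itself,
# which is exactly the contamination problem discussed above.
x = series - series.mean()
rho_hat = (x[:-1] @ x[1:]) / (x @ x)

# Periodogram vs. the theoretical AR(1) ("red noise") spectral shape.
freqs = np.fft.rfftfreq(n)
pgram = np.abs(np.fft.rfft(x)) ** 2 / n
ar1 = x.var() * (1 - rho_hat**2) / (1 + rho_hat**2 - 2 * rho_hat * np.cos(2 * np.pi * freqs))

# The strongest excess over the red-noise background marks the cycle.
ratio = pgram / ar1
peak_freq = freqs[np.argmax(ratio[1:]) + 1]
print(1 / peak_freq)
```

A robust background estimate or pre-whitening step would instead fit the noise level with the peak excluded, avoiding the upward bias in `rho_hat`.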

Gary Mitchum, U. South Florida

Title: Can We Determine Sea Level Rise Acceleration from the Instrumental Record?

Sea level from tide gauges and sea surface height from satellite altimeters have long been used to estimate sea level change rates. Attention is now turning, appropriately, to making estimates of the sea level change acceleration, which is a much more difficult problem. I will present a series of calculations aimed not at making acceleration estimates, but at fairly determining the errors in such estimates. These are used to ask whether it even makes sense to attempt to quantify acceleration at present. I conclude that doing so is very problematic using the tide gauges, but that, with some additional time, satellite altimeters will be able to make estimates that are precise enough to be of use in constraining projections of sea level change over the coming decades.

Chris Paciorek, Berkeley

Title: Using spatio-temporal statistical modeling for paleoecological reconstruction and uncertainty characterization

Paleoecological research involves a range of inferential challenges rooted in the nature of the available proxy data. I'll give a brief overview of the data used by paleoecologists to make inference about past vegetation and disturbance, the challenges posed by proxy data that are sparse in space and time, and the methods used in the paleoecological literature. I'll then turn to our recent use of hierarchical statistical models to address some of these challenges. We have developed hierarchical Bayesian spatio-temporal models for vegetation that calibrate pollen data to vegetation composition at time periods with vegetation data. The modeling approach then borrows strength in space and time to predict vegetation in the past from the pollen record, while accounting for a variety of sources of uncertainty. I'll conclude by outlining the goals of the PalEON project, which aims to use statistical estimates of past ecological processes to assess and improve ecosystem models.

Jeffrey Park, Yale University

Title: Spectral Coherence Evidence for Oceanic Control of Interannual Carbon Cycle Feedbacks

Large-scale carbon-cycle feedbacks within Earth's climate system can be inferred from the statistical correlation of atmospheric CO2 and other climate observations. These statistical relationships can serve as validation targets for global carbon-cycle models. Fourier-transform coherence between atmospheric CO2 measured at Mauna Loa, Hawaii, and Hadley Centre global-average temperatures changed in the late 20th century at interannual frequencies, from a 6-month time lag to a 90° phase lag that scaled CO2 fluctuations to a time-integral of the global-average temperature anomaly. Wavelet coherence estimates indicate that this change occurred with a recognized ocean-circulation climate transition during the late 1970s.

The CO2-coherence phase for the global-average surface-air temperature time series from NASA-GISS and the lower-troposphere temperature series from the MSU satellite is more complex than for the Hadley-Centre dataset, which is the only estimate that incorporates sea-surface temperature (SST) observations. Coherence of CO2 variations with gridpoint temperature-anomaly time series from low-latitude oceans suggests that sea-surface temperature is a primary driver of the correlation, at least for the 0.2 < f < 0.5 cyc/yr bandpass where the El-Nino/Southern-Oscillation (ENSO) climate process dominates. Evidence for terrestrial biosphere influence is strongest in the leading principal component of GLOBALVIEW CO2 variability at f ≈ 0.25 cpy, where a larger amplitude and a 4-month phase shift distinguish the mid- and high-latitude Northern Hemisphere CO2 fluctuations from those of the tropics and the Southern Hemisphere. The terrestrial signal we infer, however, coheres more strongly with oceanic-gridpoint temperatures than with continental-gridpoint temperatures.
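Band-limited coherence estimation of the kind described above can be sketched with two synthetic monthly series that share an interannual component; this illustrates the method, not the Mauna Loa analysis itself:

```python
import numpy as np
from scipy.signal import coherence

# Two synthetic monthly series sharing a component near f = 0.25 cyc/yr
# plus independent noise; all parameters are illustrative.
rng = np.random.default_rng(1)
fs = 12.0                               # samples per year (monthly)
t = np.arange(50 * 12) / fs             # 50 years
shared = np.sin(2 * np.pi * 0.25 * t)   # a 4-year "ENSO-like" cycle
temp = shared + 0.5 * rng.standard_normal(t.size)
co2 = shared + 0.5 * rng.standard_normal(t.size)

# Welch-style magnitude-squared coherence; nperseg is chosen so the shared
# frequency falls exactly on a frequency bin (12/240 = 0.05 cyc/yr).
f, Cxy = coherence(temp, co2, fs=fs, nperseg=240)
band = (f > 0.2) & (f < 0.3)
print(Cxy[band].max())                  # high coherence in the shared band
```

The cross-spectrum's phase at the coherent frequencies (available via `scipy.signal.csd`) is what carries the lead-lag information discussed in the abstract.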

Shanan Peters, U. Wisconsin

Title: The process signal of gaps in the continental shelf and deep sea sedimentary records

The sedimentary record is often described as incomplete, principally because on the short spatial scales of most studies, time is typically represented by both sediment volumes and surfaces of erosion and/or non-deposition. The latter constitutes classical "incompleteness" in the sense that records extracted from sedimentary rocks, usually at one location, cannot be used to reconstruct a continuous time series. However, many of the processes that are of interest to geoscientists not only affect proxy records, but also affect the spatiotemporal distribution of sedimentation and erosion/non-deposition. Thus, there is considerable information content in the "incompleteness" of the sedimentary record, and this process signal is measurable using the analytical approach of macrostratigraphy. Although the effective temporal resolution of macrostratigraphy is limited by the ability to correlate surfaces and sediment volumes, basin-scale applications and forward modeling highlight the ability of quantitative analyses of unconformities and hiatuses to characterize important environmental changes and to complement many proxy records thereof. A collaboratively generated working hypothesis for the spatiotemporal distribution of sediment/rock volumes and their bounding surfaces, and a robust cyberinfrastructure to manage that hypothesis and its underlying evidence, could greatly facilitate progress in addressing some of the challenges of geological data fusion.

Bala Rajaratnam, Stanford University

Title: Novel high dimensional statistical methodology for multiproxy paleoclimate reconstructions

Various climate field reconstruction (CFR) methods have been proposed to infer past temperature from (paleoclimate) multiproxy networks. We propose a new climate field reconstruction method that uses recent advances in statistics, in particular high-dimensional covariance estimation, to tackle this problem. The new CFR method provides a flexible framework for modeling the inherent spatial heterogeneities of high-dimensional spatial fields and at the same time provides the parameter reduction necessary for obtaining precise and well-conditioned estimates of the covariance structure of the field, even when the sample size is much smaller than the number of variables. Our results show that the new method can yield significant improvements over existing methods, with gains uniform over space. We also show that the new methodology is useful for regional paleoclimate reconstructions and can yield better uncertainty quantification. We demonstrate that the increase in performance is directly related to recovering the underlying structure in the covariance of the spatial field. We also provide compelling evidence that the new methodology performs well even at spatial locations with few proxies. (This is joint work with D. Guillot and J. Emile-Geay.)
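The "sample size much smaller than the number of variables" problem, and the effect of even a crude regularization, can be illustrated as follows. The fixed shrinkage weight is a stand-in for the far more careful estimators the talk describes:

```python
import numpy as np

# p grid cells observed for n years, with n << p: the empirical covariance
# is rank-deficient, so any reconstruction needing its inverse is ill-posed.
# A fixed-weight shrinkage toward the diagonal (weight chosen arbitrarily
# here) already restores a usable, well-conditioned estimate.
rng = np.random.default_rng(2)
p, n = 100, 30
true_cov = np.eye(p) + 0.3              # uniform spatial correlation, for illustration
X = rng.multivariate_normal(np.zeros(p), true_cov, size=n)

S = np.cov(X, rowvar=False)             # empirical covariance: rank <= n - 1
shrunk = 0.8 * S + 0.2 * np.diag(np.diag(S))

print(np.linalg.matrix_rank(S))         # at most n - 1 = 29, far below p
print(np.linalg.cond(shrunk) < np.linalg.cond(S))
```

Data-driven choices of the shrinkage weight (or sparsity-based regularization of the inverse covariance) are what allow the estimate to remain both well-conditioned and faithful to the field's spatial structure.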

Gavin Schmidt, NASA GISS

Title: Climate sensitivities

The response of the real world to an increase in CO2 is one of the key targets needed to assess climate change impacts. However, the definition of 'climate sensitivity' is sometimes unclear, and the variations of climate sensitivity as a function of timescale, included processes, and modeling framework are important. I will discuss the different approaches - ranging from the Charney climate sensitivity to the Earth System Sensitivity - and the prospects for constraining them using paleoclimate data.

Rina Schumer, Desert Research Institute

Title: Inference on earth surface evolution from the stratigraphic record

Time and topography are represented in strata in distorted, or filtered, form because the depositional record is incomplete and the autogenic dynamics of the geomorphic processes that create it are known to be partially stochastic. Here we define a stochastic process called the "stratigraphic filter" and consider both the forward and backward problems of stratigraphy generation and interpretation. Stratigraphic hiatuses can arise from either long periods of non-activity or alternating periods of deposition and erosion. The two types can produce different effects on statistics of measured rates.

Two measurable characteristics of the stratigraphic record show statistical regularity over a range of environments:

  1. The Layer Thickness Inventory (LTI) was developed as an unbiased measure of layering in lithologic logs. The 'fractal nature' of the stratigraphic record, as inferred from the LTI, leads to the conclusion that geophysical series have long-range negative dependence with depth.
  2. The Sadler Effect, in which the calculated average linear deposition rate is a decreasing power-law function of measurement interval, is known to arise when the distribution of hiatus lengths in the record is also power-law. These power laws arise when earth-surface fluctuations have long-range negative dependence.

Long range negative dependence refers to a series with long term switching between high and low values and is described by a correlation function that decays very slowly (as a power law). Why should such long range correlation exist in surface fluctuations?

Recent work that generalizes the Edwards-Wilkinson equation for surface growth reveals that the time series of elevation at a point on an evolving surface will exhibit the negative long-range dependence described above as a result of spatial correlations with the rest of the system. This implies that statistical characteristics of bed thickness and hiatuses arise directly from statistics of three-dimensional erosion, transport, and deposition of sediment. This has implications for detection and identification of cycles in the stratigraphic record (null hypothesis should be red noise) and improvement of subsurface geostatistical models.
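The Sadler Effect mentioned above can be reproduced with a toy random-walk surface, where deposition and erosion alternate and the apparent deposition rate measured over a window of length Δt falls off as a power law; units and ensemble size are arbitrary:

```python
import numpy as np

# Surface elevation as an unbiased random walk: deposition and erosion
# alternate, so the net thickness over a span s grows only like sqrt(s),
# and the apparent rate |Δh|/Δt scales as Δt^(-1/2) even though the
# process itself does not change with scale.
rng = np.random.default_rng(3)
steps = rng.standard_normal((1000, 4096))
elev = np.cumsum(steps, axis=1)         # 1000 independent surface histories

spans = np.array([4, 16, 64, 256, 1024, 4096])
rates = np.array([np.abs(elev[:, s - 1]).mean() / s for s in spans])

# Log-log slope of rate vs. measurement span: about -0.5 for a random walk.
slope = np.polyfit(np.log(spans), np.log(rates), 1)[0]
print(slope)
```

Surfaces with long-range negative dependence, as in the generalized Edwards-Wilkinson picture, give a different exponent, which is one way the stratigraphic statistics constrain the underlying surface dynamics.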

David Thomson, Queen's University

Title: Some observations on the problem of recovering time scales in paleo data

A recurring problem in the analysis of paleoclimate data is that of estimating the time scale. This is particularly serious with Holocene data because there are few suitable "clocks", so comparisons to reference series are used. After some preliminary comments, I describe a method to estimate dating errors assuming that they are relatively small, slowly varying, but not necessarily independent of the data. The basic expansion is similar to non-stationary quadratic-inverse theory.

Jessica Tierney, WHOI

Title: Time-uncertainty in paleoclimate proxy records and implications for climate reconstruction

Most paleoclimate archives are endowed with a certain amount of uncertainty in the time domain related to the method used to date the archive. This time-uncertainty can occasionally be large relative to the timescale of geological inquiry. In particular, time-uncertain data pose a challenge to climate reconstruction in the past millennium and have not been formally addressed in reconstruction methods commonly used to date. This talk will review the different types of time-uncertainty present in paleoclimate archives, discuss implications for the paleorecord and climate reconstruction, and propose possible solutions.

Martin Tingley, Harvard University

Title: On the simultaneous inference of past temperatures and climate sensitivity, or: How to do more work than you thought necessary to get what you wanted all along for free

Bayesian Hierarchical Models are well suited for combining numerous sources of uncertain data and prior scientific understanding -- as is the case for paleoclimate reconstruction problems. Hierarchical models necessitate a clear statement of scientific assumptions, while Bayesian inference results in thorough uncertainty quantification. These gains are admittedly not free, and fitting Bayesian Hierarchical Models generally involves substantial costs in the form of programming and executing problem-specific Markov chain Monte Carlo routines. To demonstrate the rich return on investment from buying into the Bayesian Hierarchical Modeling paradigm, I will present results from two related analyses of the late Holocene paleoclimate record. The first analysis will show that recent warm extremes, such as the 2010 Russian Heat Wave, can be accounted for by a simple shift in the underlying mean of the climate, without recourse to changes in variability. The second analysis includes estimates of climate forcings within the Bayesian Hierarchical Model, and thus requires inference on a parameter that links changes in temperature to changes in greenhouse gas concentration. This scaling parameter is directly related to the standard definition of transient climate sensitivity, which is estimated from 2000 years of proxy data to be about 2.25 °C to 3.5 °C.

Nathan Urban, LANL

Title: Climate sensitivity estimated from the Last Glacial Maximum

A Bayesian method is developed to estimate the climate sensitivity to carbon dioxide from paleo-temperature reconstructions of the Last Glacial Maximum. The method uses proxy data to constrain feedback parameter settings in a perturbed-physics ensemble using UVic, a climate model of intermediate complexity with a 3D dynamic ocean and an energy-moisture balance atmosphere. Combined land and ocean temperature proxies indicate a climate sensitivity range of 1.7 to 2.6 K, but there is a tension between the two data sources: ocean data constrain climate sensitivity from 1.3 to 2.7 K, but land data constrain it from 2.2 to 4.6 K. This suggests an inconsistency between proxies or the model's representation of the land-ocean contrast. Finally, the sensitivity of the conclusions to physical and statistical assumptions is examined.

Jane Willenbring, U. Penn

Title: Birth, life and fate of continental sediment in the wake of climate change and mountain uplift

Sediment observed in outcrops and cores can sometimes be all that we have to understand the geologic past. Recent work has questioned whether and under which conditions climate change, tectonics and Earth-surface dynamics are recordable in the sedimentary record. In this talk, I will consider, from a population dynamics perspective, two related questions that are now answerable using cosmogenic nuclides: 1. Where is the most likely origin of a particle of sediment going into the ocean? and 2. How does time alter sedimentary records and add bias to our view into the past? Cosmogenic nuclides provide a unique, relatively new tool to answer these questions because their concentrations record denudation rates that naturally average over timescales similar to those of climatic, tectonic and geomorphic change. Moreover, denudation as measured by cosmogenic nuclides is not a reversible process like sedimentation.

Document last modified on January 10, 2013.