Chapter 3: Changes in Climate Extremes and their Impacts on the Natural Physical Environment

In the case of statistical downscaling, uncertainties are induced by, inter alia, the definition and choice of predictors (Benestad, 2001; Hewitson and Crane, 2006; Timbal et al., 2008) and the underlying assumption of stationarity (Raje and Mujumdar, 2010). In general, both approaches to downscaling are maturing and being more widely applied, but are still restricted in terms of geographical coverage (Maraun et al., 2010). For many regions of the world, no downscaled information exists at all and regional projections rely only on information from GCMs (see Table 3-3).

For many user-driven applications, impact models need to be included as an additional step for projections (e.g., hydrological or ecosystem models). Because of the previously mentioned issues of scale discrepancies and overall biases, it is necessary to bias-correct RCM data before input to some impacts models (i.e., to bring the statistical properties of present-day simulations in line with observations and to use this information to correct projections). A number of bias correction methods, including quantile mapping and gamma transform, have recently been developed and exhibit promising skill for extremes of daily precipitation (Piani et al., 2010; Themeßl et al., 2011).
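As an illustration of the quantile mapping idea referred to above, the Python sketch below matches the quantiles of present-day model output to observed quantiles and applies the resulting transfer function to projected values. It is a minimal empirical version, not the specific implementations of Piani et al. (2010) or Themeßl et al. (2011); the function name, the synthetic gamma-distributed data, and the clamping at the calibration range are illustrative assumptions.

import numpy as np

def quantile_map(obs, model_hist, model_fut, n_quantiles=99):
    """Empirical quantile mapping (illustrative sketch).

    Build a transfer function that maps the distribution of present-day
    model output onto the observed distribution, then apply it to the
    projected (future) values.
    """
    q = np.linspace(0.01, 0.99, n_quantiles)
    obs_q = np.quantile(obs, q)         # observed quantiles
    mod_q = np.quantile(model_hist, q)  # model quantiles, same period
    # Each future value is placed within the modelled distribution and
    # replaced by the observed value at the same quantile (linear
    # interpolation; values outside the calibration range are clamped).
    return np.interp(model_fut, mod_q, obs_q)

# Toy daily precipitation (mm/day) standing in for station observations
# and a biased RCM; the gamma parameters are arbitrary.
rng = np.random.default_rng(0)
obs = rng.gamma(shape=0.8, scale=6.0, size=3650)
model_hist = rng.gamma(shape=0.6, scale=9.0, size=3650)
model_fut = rng.gamma(shape=0.6, scale=10.5, size=3650)
corrected_fut = quantile_map(obs, model_hist, model_fut)

One practical limitation of this empirical version is that values beyond the calibration range, which are exactly the new extremes of interest, are clamped; parametric approaches such as the gamma transform mentioned above are one way of addressing this.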
3.2.3.3. Ways of Exploring and Quantifying Uncertainties

Uncertainties can be explored, and quantified to some extent, through the combined use of observations and reanalyses, process understanding, a hierarchy of climate models, and ensemble simulations. Ensembles of model simulations represent a fundamental resource for studying the range of plausible climate responses to a given forcing (Meehl et al., 2007b; Randall et al., 2007). Such ensembles can be generated by (i) collecting results from a range of models from different modelling centers (multi-model ensembles), to include the impact of structural model differences; (ii) generating simulations with different initial conditions (intra-model ensembles), to characterize the uncertainties due to internal climate variability; or (iii) varying multiple internal model parameters within plausible ranges (perturbed and stochastic physics ensembles), with both (ii) and (iii) aiming to produce a more systematic estimate of single-model uncertainty (Knutti et al., 2010b).

Many of the global models utilized for the AR4 were integrated as ensembles, permitting more robust statistical analysis than is possible if a model is only integrated to produce a single projection. Thus the available CMIP3 Multi-Model Ensemble (MME) GCM simulations reflect both inter- and intra-model variability. In advance of AR4, coordinated climate change experiments were undertaken which provided information from 23 models from around the world (Meehl et al., 2007a). The CMIP3 simulations were made available at the Program for Climate Model Diagnosis and Intercomparison (www-pcmdi.llnl.gov/ipcc/about_ipcc.php). However, the higher temporal resolution (i.e., daily) data necessary to analyze most extreme events were quite incomplete in the archive, with only four models providing daily averaged output with ensemble sizes greater than three realizations and many models not included at all. GCMs are expensive to run; thus a compromise is needed between the number of models, the number of simulations, and the complexity of the models (Knutti, 2010).

Besides the uncertainty due to randomness itself, which is the canonical statistical definition, it is important to distinguish between the uncertainty due to insufficient agreement in the model projections, the uncertainty due to insufficient evidence (insufficient observational data to constrain the model projections, an insufficient number of simulations from different models, or insufficient understanding of the physical processes), and the uncertainty induced by insufficient literature, that is, the lack of published analyses of projections. For instance, models may agree on a projected change, but if this change is controlled by processes that are not well understood and validated in the present climate, then there is an inherent uncertainty in the projections, no matter how good the model agreement may be. Similarly, available model projections may agree on a given change, but the number of available simulations may limit the reliability of the inferred agreement (e.g., because the analyses need to be based on daily data that may not be available from all modelling groups). All these issues have been taken into account in assessing the confidence and likelihood of projected changes in extremes for this report (see Section 3.1.5).

Uncertainty analysis of the CMIP3 MME in AR4 focused essentially on the seasonal mean and inter-model standard deviation values (Christensen et al., 2007; Meehl et al., 2007b; Randall et al., 2007). In addition, confidence was assessed in the AR4 through simple quantification of the number of models that show agreement in the sign of a specific climate change (e.g., the sign of the change in frequency of extremes), assuming that the greater the number of models in agreement, the greater the robustness. However, the shortcoming of this definition of model agreement is that it does not take account of possible common biases among models. Indeed, the ensemble was strictly an 'ensemble of opportunity', without a sampling protocol, and the possible dependence of different models on one another (e.g., due to shared parameterizations) was not assessed (Knutti et al., 2010a). Furthermore, this particular metric, which assesses sign agreement only, can provide misleading conclusions in cases where, for example, the projected changes are near zero. For this reason, in our assessments of projected changes in extreme indices we consider model agreement a necessary but not a sufficient condition for likelihood statements [e.g., agreement of 66% of the models, as indicated with shading in several of the figures (Figures 3-3, 3-4, 3-6, 3-8, and 3-10), is a minimum but not a sufficient condition for a change being considered 'likely'].
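The sign-agreement criterion described in the preceding paragraph can be written down in a few lines. The sketch below assumes, purely for illustration, that each model's projected change in an extreme index is already on a common grid and stored in a (models x grid points) array, and it uses the 66% threshold adopted for the figure shading.

import numpy as np

def sign_agreement(changes, threshold=0.66):
    """Fraction of models agreeing with the sign of the multi-model mean
    change at each grid point, plus a mask of points meeting `threshold`.

    changes : array of shape (n_models, n_gridpoints), projected change
              in some extreme index for each model.
    """
    mean_change = changes.mean(axis=0)
    agrees = np.sign(changes) == np.sign(mean_change)  # broadcasts over models
    frac = agrees.mean(axis=0)
    return frac, frac >= threshold

# Toy ensemble: 23 models (as in CMIP3) at 10 grid points.
rng = np.random.default_rng(1)
changes = rng.normal(loc=0.4, scale=1.0, size=(23, 10))
frac, meets_threshold = sign_agreement(changes)
# Note: a grid point with a near-zero multi-model mean can still clear the
# 66% bar, which is one reason sign agreement is treated as necessary but
# not sufficient for a 'likely' statement.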
Post-AR4 studies have concentrated more on the use of the MME in order to better characterize uncertainty in climate change projections, including those of extremes (Kharin et al., 2007; Gutowski et al., 2008a; Perkins et al., 2009). New techniques have been developed for exploiting the full ensemble information, in some cases using observational constraints to construct probability distributions (Tebaldi and Knutti, 2007; Tebaldi and Sanso, 2009), although issues such as determining appropriate metrics for weighting models are challenging (Knutti et al., 2010a). Perturbed-physics ensembles have also become available (e.g.,
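To make the notion of model weighting discussed above concrete, the toy sketch below down-weights models with a large bias against an observed climatological value before averaging their projected changes. It is emphatically not the Bayesian method of Tebaldi and Knutti (2007); the Gaussian weighting rule, the array shapes, and all numbers are assumptions chosen only to illustrate why the choice of performance metric matters.

import numpy as np

def performance_weights(model_hist, obs_mean, sigma=1.0):
    """Illustrative weights: a Gaussian kernel on each model's bias against
    an observed climatological mean (a toy stand-in for the formal metrics
    discussed in the literature)."""
    bias = model_hist.mean(axis=1) - obs_mean   # one bias value per model
    w = np.exp(-0.5 * (bias / sigma) ** 2)
    return w / w.sum()

# Toy data: 23 models, 30 "years" of an annual extreme index, plus each
# model's projected change in that index.
rng = np.random.default_rng(2)
model_hist = rng.normal(loc=12.0, scale=2.0, size=(23, 30))
proj_change = rng.normal(loc=1.5, scale=0.8, size=23)
w = performance_weights(model_hist, obs_mean=11.5)
weighted_change = float(np.sum(w * proj_change))
unweighted_change = float(proj_change.mean())

Changing sigma, or the metric used to define the bias, can noticeably shift the weighted estimate relative to the unweighted one, which illustrates the sensitivity to the weighting metric noted by Knutti et al. (2010a).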
