Rapid extreme weather attribution, aka climate ambulance chasing, has come of age and the attributors are in high demand:

For Friederike Otto, a climate modeller at the University of Oxford, UK, the past week has been a frenzy, as journalists clamoured for her views on climate change’s role in the summer heat. “It’s been mad,” she says. The usual scientific response is that severe heatwaves will become more frequent because of global warming. But Otto and her colleagues wanted to answer a more particular question: how had climate change influenced this specific heatwave? After three days’ work with computer models, they announced on 27 July that their preliminary analysis for northern Europe suggests that climate change made the heatwave more than twice as likely to occur in many places.

Soon, journalists might be able to get this kind of quick-fire analysis routinely from weather agencies, rather than on an ad hoc basis from academics. With Otto’s help, Germany’s national weather agency is preparing to be the first in the world to offer rapid assessments of global warming’s connection to particular meteorological events. By 2019 or 2020, the agency hopes to post its findings on social media almost instantly, with full public reports following one or two weeks after an event. “We want to quantify the influence of climate change on any atmospheric conditions that might bring extreme weather to Germany or central Europe,” says Paul Becker, vice-president of the weather agency, which is based in Offenbach. “The science is ripe to start doing it”.

So it will be just like getting the weather report, albeit a few days or even just hours after the event; but in this case, you'll get to know whether a storm, heatwave, drought, deluge or freezing spell is simply 'weather', or whether it is all the fault of those nasty anthropogenic CO2 molecules which keep popping up in our atmosphere courtesy of our exhaust pipes and modish technological lifestyles. Presumably, if found guilty, we can then all engage in a universal display of repentance: lashing ourselves with the electrical flex ripped from the back of the new dishwasher or the cinematic 60-inch flat-screen TV with 3D surround sound, slashing the tyres on the 4×4, and saying 'it's a fair cop' when the insurance company rejects our claim for damage to the roof because it was 'not a natural disaster'. Oh yes, 'twill be fun.

But before we get too carried away, and start thinking about smashing up the old C-rated washing machine in an anguished fit of remorse, let's take a look at the supposed damning scientific evidence that we did in fact cause ourselves to cook this summer in what will probably turn out to be the hottest, longest and driest heatwave in parts of northern Europe since way back when it was last this hot and dry. Take it away, World Weather Attribution:

Here we present an attribution study of the ongoing heat wave made in near real time using well assessed methodologies. It is not peer-reviewed and was written quickly. We used thoroughly tested methods to do the analysis, evaluation of models and checked the observations for errors. The return times are partly based on forecasts and therefore have additional uncertainties. However, the changes in probability are based on past observations and model results, and the authors are confident that these results are robust. We are very grateful to Peter Thorne and Peter Thejll for making the Irish (from Met Éireann) and Danish (from DMI) temperature observations available to us.

A summary of the key findings of this quick-fire attribution:

  • The heat (based on observations and forecast) is very extreme near the Arctic circle, but less extreme further south: return periods are about 10 years in southern Scandinavia and Ireland, five years in the Netherlands
  • From past observations and models we find that the probability of such a heatwave to occur has increased everywhere in this region due to anthropogenic climate change, although in Scandinavia this increase was not visible in observations until now due to the very variable summer weather.
  • We estimate that the probability to have such a heat or higher is generally more than two times higher today than if human activities had not altered climate.
  • Due to the underlying warming trend even record breaking events can be not very extreme but have relatively low return times in the current climate.
  • With global mean temperatures continuing to increase, heat waves like this will become even less exceptional.

The authors then define the event thus: “To define the event, we analyse the three-day maximum temperature average (TX3x) at individual locations.” This produces the heat map below:

[Figure 2 of the WWA report, 27 July: heat map of TX3x anomalies across Europe]

Anomalies are with respect to the 1981-2010 climate normal. You will notice immediately that it has not been very hot, in fact really quite cool, across most of southern Europe. Presumably this is why the authors chose a number of weather stations in Ireland, northern Europe and Scandinavia to analyse the event. The station locations are:

  • Phoenix Park (Dublin, Ireland, 53.36N, -6.32E, 49.0m),
  • De Bilt (Netherlands, 52.10N, 5.18E, 1.9m),
  • Landbohøjskolen (Copenhagen, Denmark, 55.7N, 12.5E, 9m),
  • Oslo Blindern (Norway, 59.94N, 10.72E, 94.0m),
  • Linköping (Sweden, 58.40N, 15.53E, 93.0m),
  • Sodankylä (Finland, 67.37N, 26.63E, 179.0m) and
  • Jokioinen (Finland, 60.81N, 23.50E, 104.0m).
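
Before going further, it is worth pinning down what TX3x actually is: the highest three-day running mean of daily maximum temperature over the period of interest. A minimal sketch of the calculation in Python (the example numbers are made up, not station data):

    import numpy as np

    def tx3x(daily_tmax):
        """Highest 3-day running mean of daily maximum temperature (deg C)."""
        daily_tmax = np.asarray(daily_tmax, dtype=float)
        running = np.convolve(daily_tmax, np.ones(3) / 3, mode="valid")
        return running.max()

    # Hypothetical July Tmax series peaking mid-month
    july = [22, 24, 27, 31, 33, 34, 30, 26, 23, 21]
    print(tx3x(july))  # hottest 3 consecutive days: (31 + 33 + 34) / 3 ≈ 32.7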

The authors explain the rationale:

In this article we do not analyse large area averages or country averages as in previous analyses of high temperatures but focus instead on a number of individual locations in Northern Europe where long records of observed data are available.

We firstly analyze observed temperatures and estimate how rare the current heat wave is, measured as three-day maximum temperatures, and whether or not there is a trend toward increasing temperature.

So basically, the actual analysis is only applicable to these seven station locations, though obviously regions outside these locations were affected by the anomalous weather too, the UK in particular; hence the numerous comparisons to the summer heatwaves of 1976, 1995, 2003, etc.

They analyse the observed temps to see how rare they are in the context of the existing historical records and they also compare these with models:

Secondly, to assess the role of climate change, we compare observations with results from climate models that are available and suitable for the temperatures in these locations. This answers the question whether and to what extent external drivers, in particular human-caused climate change, can explain the temperature trends in the observational data. Including models allows us to give the causation of a trend.

This makes it very clear that the attribution stems from the use of the models. Now here’s where the fun starts:


Key Assumptions About AGW And Natural Variability

For transient simulations of the changing climate, we again calculate how the probability of the event is changing over time in the model data, by fitting the temperature values to a distribution that shifts proportional to the smoothed global mean temperature. This method assumes that global warming is the main factor affecting local temperatures since about 1900, and that virtually all global warming is attributable to anthropogenic factors. In Europe, with very little decadal variability, the first condition is met. The second assumption is the conclusion of the IPCC, but disregards the uncertainty surrounding it (IPCC WG1 AR5 Chapter 10).

My bold. The authors assume that all global warming since 1900 is anthropogenic and that this is the main factor affecting temperature at the specified locations. They justify these assumptions by reference to IPCC WG1 AR5 Ch. 10. They also state, bare-faced, that northern European climate is subject to very little decadal variability . . . .
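
For what it's worth, the 'shift fit' described above can be sketched quite simply: fit a GEV distribution to a station's annual TX3x values, with the location parameter shifting in step with smoothed global mean surface temperature (GMST), then compare the probability of exceeding the observed event value at today's GMST with the probability at the GMST of around 1900. Note where the assumptions just quoted enter: GMST is the only covariate allowed to move the distribution. A minimal illustration in Python, using synthetic placeholder data (this is a sketch of the method as described, not the WWA code):

    import numpy as np
    from scipy.stats import genextreme
    from scipy.optimize import minimize

    # Synthetic stand-ins for annual TX3x (deg C) and smoothed GMST anomaly (deg C)
    rng = np.random.default_rng(0)
    years = np.arange(1901, 2019)
    gmst = 0.008 * (years - 1901)                     # toy smoothed GMST trend
    tx3x = 26 + 1.5 * gmst + rng.gumbel(0, 1.2, years.size)

    def nll(p):
        """Negative log-likelihood of a GEV whose location shifts with GMST."""
        mu0, alpha, log_sigma, c = p
        return -genextreme.logpdf(tx3x, c, loc=mu0 + alpha * gmst,
                                  scale=np.exp(log_sigma)).sum()

    fit = minimize(nll, x0=[26.0, 1.0, 0.2, 0.0], method="Nelder-Mead")
    mu0, alpha, log_sigma, c = fit.x

    event = 33.0  # hypothetical 2018 TX3x value at the station
    p_now = genextreme.sf(event, c, loc=mu0 + alpha * gmst[-1], scale=np.exp(log_sigma))
    p_then = genextreme.sf(event, c, loc=mu0 + alpha * gmst[0], scale=np.exp(log_sigma))
    print("Risk Ratio:", p_now / p_then)

Whatever the fit pins on rising GMST is then, by construction, attributed to anthropogenic forcing; natural modes such as the AMO have nowhere to live in this set-up.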

Let's deal with the second assumption first: that most or all global warming since 1900 is anthropogenic. Let us go to AR5 WG1 Ch. 10 and see what it says about the attribution of warming post-1900. When we do, we find it has an awful lot to say about attribution post-1950, stating that it is extremely likely that more than half of the observed warming 1951-2010 was caused by anthropogenic forcings. 1900 to 2010, not so much. What it does say is this:

The pattern of warming and residual differences between models and observations indicate a role for circulation changes as a contributor to early 20th century warming (Figure 10.2), and the contribution of internal variability to the early 20th century warming has been analysed in several publications since the AR4. Crook and Forster (2011) find that the observed 1918–1940 warming was significantly greater than that simulated by most of the CMIP3 models. A distinguishing feature of the early 20th century warming is its pattern (Brönnimann, 2009) which shows the most pronounced warming in the Arctic during the cold season, followed by North America during the warm season, the North Atlantic Ocean and the tropics.

Some studies have suggested that the warming is a response to the AMO (Schlesinger and Ramankutty, 1994; Polyakov et al., 2005; Knight et al., 2006; Tung and Zhou, 2013), or a large but random expression of internal variability (Bengtsson et al., 2006; Wood and Overland, 2010).

Nonetheless, these studies do not challenge the AR4 assessment that external forcing very likely made a contribution to the warming over this period. In conclusion, the early 20th century warming is very unlikely to be due to internal variability alone. It remains difficult to quantify the contribution to this warming from internal variability, natural forcing and anthropogenic forcing, due to forcing and response uncertainties and incomplete observational coverage.

How does this square with the authors' statement that "the second assumption is the conclusion of the IPCC, but disregards the uncertainty surrounding it (IPCC WG1 AR5 Chapter 10)"? Answer: it simply doesn't. It is not the IPCC's conclusion. The IPCC state that the uncertainties do not permit a quantifiable attribution of warming pre-1950, only that external forcing very likely made a contribution. In actual fact, numerous recent studies affirm the dominant role in 1910-1940s global warming of the AMO mode of internal variability combined with solar forcing, especially for the Arctic and locations bordering the North Atlantic (which just happens to be the region where the current heatwave is taking place!).

The IPCC statements also blow a hole in the first assumption of the report's authors, i.e. that the station locations they're analysing are subject to little decadal variability. The IPCC clearly mention the importance of the AMO, which affects precisely this region. Northern European climate is subject to a great deal of multi-decadal internal variability, driven principally by the AMO, whose roughly 70-year cycle is more than sufficient to have affected temperatures at the specified locations from 1900 to the present! For the avoidance of doubt in this respect:

The Atlantic multidecadal oscillation (AMO) explains over 90% of the pronounced decadal temperature and summer precipitation variation. Understanding the impact of these ocean variations when interpreting long climate records, particularly in the context of a changing climate, is crucial.

[https://rmets.onlinelibrary.wiley.com/doi/full/10.1002/wea.2543]

That rather buggers up the supposedly eminently suitable Phoenix Park (Dublin) data for demonstrating changes in the probability of extreme weather driven solely by anthropogenic influences, untroubled by natural climate variability!

The story is not much different for the other locations, all of which are subject to the influence of Atlantic multi-decadal variability.

Thus the two main assumptions of the WWA rapid attribution report are contradicted by the evidence:

  1. The locations used are subject to significant natural decadal/multi-decadal variability, contrary to the assumption of 'very little'.
  2. Anthropogenic GHGs are unlikely to have dominated global warming prior to 1950, contrary to the assumption that virtually all warming since 1900 is anthropogenic.

Not good, but that’s not the end of it.


The Observations Disagree With The Models

As mentioned, the analysis uses models to estimate the likely anthropogenic influence. Here’s what the authors say about the model fit vs. data at the various stations:

Dublin Phoenix Park: Only two models had variability compatible with the observations (EUROCORDEX has 30% too much variability but we decided to keep it; EC-Earth is fine but is also downscaled by RACMO, which fits the observations even better). The Risk Ratio is about two in all estimates; the straight average of observations and the two models gives a Risk Ratio of 2.0 with a 95% uncertainty interval of 1.2 … 3.3.

De Bilt: Observations show a much bigger increase in probability than the models, which also show some spread. We verified that the trend is the same at the other homogenised non-coastal stations in the Netherlands (Eelde and Maastricht). Starting in 1951, and hence avoiding the breaks altogether, gives even higher trends and hence Risk Ratios. The same mismatch extends into Central Europe and has been discussed by Min et al., 2013 and Sippel et al., 2016. The cause is as yet unknown, so the best we can do at this moment is to average these values (on a logarithmic scale), leading to an increase in probability of very roughly a factor three, and definitely larger than one.

Copenhagen Landbohøjskolen: Both models with variability that is compatible with the observations show a somewhat stronger increase than the observed trend, but within uncertainties the results are reasonably compatible (χ²/dof=3.4). The average Risk Ratio is roughly five (larger than 2.4).

The Scandinavian stations are even worse:

Oslo Blindern: The observed trend has large uncertainties and is therefore compatible with both models that have realistic variability. The two models with realistic variability agree well. The average gives a Risk Ratio between 1 and 10, so we can say that the probability increased but not very well by how much.

Linköping (Sweden): Again there is huge uncertainty in the observed trend. The EUROCORDEX ensemble has a higher trend than the other two models, so we cannot say much more than there is definitely an increase in probability for heat waves.

Sodankylä: In the high Arctic, the uncertainties both on the observations and on the climate models are enormous. Given the strong positive trends in the climate models, and the possibility of a positive trend in the observations, we can state that the trend is positive but nothing more. Note that the scale extends over eight orders of magnitude.

Jokioinen: The same holds for the more southerly station in Finland.
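
Incidentally, the 'averaging on a logarithmic scale' mentioned for De Bilt is just a geometric mean of the individual Risk Ratios, which treats ratios above and below one symmetrically. A toy illustration in Python (the numbers are made up, not taken from the report):

    import numpy as np

    # Hypothetical Risk Ratios from observations and two models
    risk_ratios = np.array([5.0, 2.0, 1.8])

    # Geometric mean = arithmetic mean on the log scale
    combined = np.exp(np.log(risk_ratios).mean())
    print(round(combined, 2))  # ~2.62, well below the arithmetic mean of ~2.93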

The authors explain the rationale behind choosing individual stations, even though the mismatch between observations and models is plainly a problem. Basically, it's because real people live at these particular locations rather than spread statistically across wide regional averages, so the attribution result is more pertinent!

In our analysis we have looked at local weather stations in northern Europe where people are experiencing the heat wave today. In individual locations the interannual variability in summer temperatures is much larger than if we would average over countries or the whole region that is experiencing a heat wave in the early summer of 2018. The climate change signal is thus harder to detect from the noise in individual places. However, it is towns and villages where people live and not area averages, hence we chose to focus on stations only in this analysis.
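
The statistical point is fair enough as far as it goes: averaging N noisy stations shrinks the weather noise by roughly the square root of N while leaving any common trend intact, so a single station is the hardest place to detect a signal. A quick illustration with synthetic data:

    import numpy as np

    rng = np.random.default_rng(2)
    n_years, n_stations = 100, 20
    trend = 0.01 * np.arange(n_years)                    # common warming signal
    noise = rng.normal(0.0, 1.0, (n_stations, n_years))  # local weather noise
    stations = trend + noise

    print(stations[0].std())            # one station: trend buried in ~1 deg C noise
    print(stations.mean(axis=0).std())  # 20-station mean: noise down by ~sqrt(20)

The flip side, of course, is that a signal which is 'harder to detect from the noise' at single stations leans all the more heavily on the models to supply it.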

In summary, here is what the authors have to say about the observations and the models for the particular locations used in their attribution study:

In Ireland, the Netherlands and Denmark there are clear trends towards more heat waves in the observations. In Ireland and Denmark climate models give a very similar increase in probabilities to the observations — roughly a factor two more likely in Dublin and a factor four in Denmark. In the Netherlands the observed increase in hot extremes is much larger than the modeled increase. This is a well-known problem (Min et al., 2013; Sippel et al., 2016) but the cause has not yet been elucidated.

However, this is not the case further north. No trend is currently detectable in the observations there. However, due to the large variability of the summer weather, the uncertainty in the trends is so large that the observations are also compatible with large changes in the likelihood of heat extremes to occur. In the case of Scandinavia, the models are thus probably a better source to estimate the change in likelihood as we have large ensembles of model experiments.

Does all this inspire much confidence in the use of these models and observations combined to attribute the influence of anthropogenic climate change upon the 2018 heatwave in Europe and Scandinavia?

Does it inspire much confidence in this Guardian headline, outlining the results of this rapid attribution study?

Heatwave made more than twice as likely by climate change, scientists find

Fingerprints of global warming clear, they say, after comparing northern Europe’s scorching summer with records and computer models


I have to say: no, it bloody well doesn’t! But, as usual, it’s out there now, all across the media airwaves; climate alarmists everywhere are saying ‘the science is in – climate change did cause the 2018 northern European heatwave’. 99% of them probably haven’t even bothered to read the actual study which purports to be ‘the science’.

Those of us who do not leap at the opportunity to believe in the magic science of extreme weather attribution will note the massive and unexpected cooling in the tropical and Northern Atlantic this year and question whether this has had any influence at all upon the concurrent heatwave in northern Europe. Joe Bastardi has an excellent Patriot Post article on exactly this.


Update: 8 July 2019

What a difference a year makes:

[image]


34 Comments

  1. The Guardian article cited by Jaime is at
    https://www.theguardian.com/environment/2018/jul/27/heatwave-made-more-than-twice-as-likely-by-climate-change-scientists-find

    Unlike Jaime’s article, the Guardian article contains no science, only quotes from scientists. Jaime’s article will never make it into a serious mainstream newspaper because:
    – It’s too long.
    – It’s got lots of science in it, quoting significant claims from established scientific authorities (i.e. peer reviewed papers and the IPCC) in detail (i.e. not sound bites from the press handout.)
    – It demands that the reader pay close attention to the argument for ten minutes or more.

    The claim that all or most warming 1900 to 1950 was anthropogenic has particular significance for policy makers. It suggests that even if we go back to a world where 5% of the population drives a Model T and the rest walk, we’re still not going to solve it.

    Liked by 2 people

  2. A bulletin published today by the Subsidy Farming Research Unit (SFRU) of the University of West Anglia states that its latest, peer-reviewed model of international peanut catastrophic cultivation (IPCC) indicates that the current heatwave in Northern Europe is 37.54 times more likely to have occurred as a result of anthropogenic influence on the climate due to excessive peanut cultivation and consumption. Prof. Schmutzelburger of the SFRU has stated that unless peanuts are replaced by non-allergenic crops such as sun-dried Brussels sprouts, we could see an 18-24 metre increase in ocean levels due to allergenic interaction with the ocean depths. "The time to act is now," he said. "We have only a hundred days to prevent unimaginable catastrophe which could result in the complete obliteration of subsidy farming as we know it."

    Liked by 1 person

  3. Jaime. Thank you for assiduously tearing at the roots of this pile of attribution crap and explaining in relatively easy terms what is wrong with it. No doubt the authors are expecting shedloads of dosh for similar attributions upon demand in the future. They thought the world their oyster. You’ve done a commendable shucking job.

    Liked by 2 people

  4. Excellent, informative article, Jaime.

    Typically, Channel 4 News seems to be going down the Guardian route:
    https://sites.google.com/site/mytranscriptbox/2018/20180724_c4

    Liam Dutton: Now, when you put all these hot bubbles together, around the planet, these heatwaves, you start to see the fingerprints of climate change on them, because you know, it's hot weather, more intense heatwaves, it's heat in places where it shouldn't normally be. So effectively, what we're describing today as extreme weather, in the future, is likely to be the new norm.

    Kathryn Brown: So what we expect in the UK is for heatwaves like the one we’re experiencing, at the moment, to become more frequent and more intense. And if you look back at the heatwave from 2003 as an example, we expect that to become a normal summer by 2040, under a, sort of, medium level of global warming.

    Liked by 4 people

  5. The attribution model is simple:

    Is the particular weather we are experiencing pleasant?
    If yes: Weather. If no: Climate change.

    Like

  6. Weather forecasting is getting quite accurate (at least in my part of the UK, and for up to 4-5 days ahead). This leads to the exciting possibility of the Oxford Group being able, ahead of time, to attribute what part of this future experience will be due to CO2. We can expect future weather forecasters to incorporate a blame-game element.

    Like

  7. Jaime,

    Thanks for looking into some of the assumptions and the rationale for the attribution.

    A new tool is available for evaluating events, such as the attribution noted below,

    https://www.sfchronicle.com/science/article/Scientists-see-fingerprints-of-climate-change-all-13128585.php

    that lets video gamers and modelers alike play with virtual fire in the comfort of their offices:

    https://www2.ucar.edu/atmosnews/just-published/133629/fanning-flames-megafires

    The King Fire was evaluated to develop the model above.

    Real world firefighters have been looking at new technology to assist in their efforts as well-
    https://www.sfchronicle.com/business/article/Wildfires-have-changed-The-technology-to-fight-13131369.php

    “We don’t want to introduce unintended consequences,” said Ralph Gonzales, fire and aviation portfolio manager at the U.S. Forest Service’s Technology and Development Program in San Dimas (Los Angeles County).

    Liked by 1 person

  8. The BBC is at it, too:
    https://sites.google.com/site/mytranscriptbox/2018/20180726_bb

    During the discussion, the Beeb’s Christian Fraser makes a good point, which no-one there really addresses:

    “…of course in developing countries, poverty is linked to the fact that they don’t have energy and electricity – they’re going to get it, fossil fuels are cheaper and will be cheaper up to 2040, so we might change our behaviour but there’s no reason why they would.”

    Liked by 2 people

  9. After three days’ work with computer models, they announced on 27 July that their preliminary analysis for northern Europe suggests that climate change made the heatwave more than twice as likely to occur in many places.

    The “many places” is significant. It is not that a 1976/2018-type heat wave is now any more common; obviously it isn't, since otherwise we would not have had to wait 42 years for a repeat! What they are saying is that the models tell them that when it happens, a larger area will be affected. The maximum temperature could then be 1C warmer as well, although the full reckoning will have to wait till the end of August.

    So don’t expect nice sunny weather next summer. It is just as likely to be another wash out!

    Like

  10. “…climate change made the heatwave more than twice as likely to occur..”

    One can’t over-emphasise how weird that sentence is. Does it even mean anything? It’s natural that in a heatwave like the current one, people will turn to the experts and demand answers to questions; questions like: “will there be a water shortage? Or power cuts?” Or: “Will it get cooler if I recycle, or insulate my loft?”

    I’ve been trying to formulate an equivalent question in a similar field where various complex factors are known to influence outcomes, some known, some unknown, and some random. Take football, a subject about which I know even less than I do about climate.

    It is generally accepted that the higher-than-average skill of the English team was a factor in their reaching the semi-finals (a "forcing", as football fans call it). Now, ask yourself the question that was on every fan's lips: "How much more likely did the superior quality (or 'skill anomaly', measured against the average skill from 1962 to 2002) of the English team make it that they would reach the semi-finals? And how much greater will it have to be to make it twice as likely that we reach the finals next time?"

    You have three days, and the use of a billion dollars worth of computers to provide an answer.

    Liked by 1 person

  11. Geoff,

    “…climate change made the heatwave more than twice as likely to occur..”

    “One can’t over-emphasise how weird that sentence is. Does it even mean anything?”

    Peter Stott of the Met Office developed this methodology back in 2003, when he and colleagues 'attributed' the 2003 European heatwave to climate change. He basically calculated what is known in other fields as the fraction of attributable risk (FAR), 1 − P0/P1, where P0 and P1 are the event probabilities without and with anthropogenic forcing, by running computer models with and without those forcings for the region in question and comparing them with observations, finding that the probability of such a rare heatwave was twice what it would be in a non-GHG heated world. I find the whole concept deeply troubling when it is used to 'attribute' an extreme weather event to a physical cause without actual physical, empirical evidence.
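
    To put illustrative numbers on it (mine, not Stott's): if the model world with anthropogenic forcing gives the event a probability P1 = 0.02 per year and the unforced model world gives P0 = 0.01, then the risk ratio is P1/P0 = 2 ('twice as likely') and FAR = 1 − 0.01/0.02 = 0.5, i.e. half the current risk is 'attributed' to the forcing. Note that P0 is a probability in a world that exists only inside the model.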

    Fredi Otto and World Weather Attribution seem determined to construct an entirely new business out of rapid attribution of extreme weather, using basically the same methodology. Nice work if you can get it.

    https://www.sciencedaily.com/releases/2004/12/041203133043.htm

    Liked by 1 person

    > He basically calculated what is known in other fields as the fraction
    > of attributable risk (FAR), 1 − P0/P1, by running computer models with and
    > without anthropogenic forcings for the region in question…

    This is a prime example of the circular reasoning which seems rife within the climate modeling community. If you start with the assumption that the only degree of freedom in determining temperature is the atmospheric CO2 fraction, then it's hardly surprising that it gets hotter when this increases. Given that, it's also hardly surprising that temperature-related extreme weather events become more likely, despite us having, shall we say, a 'less than perfect' understanding of how such events occur.

    Soon there will be no need to even run the models, as the results will already be known. I'm just thankful for the internet, so that it's possible to look back and see how virtually all of the absurd predictions of climate calamity have failed to materialise (and that there's no Winston Smith engaged in re-writing past predictions to match observed reality).

    Liked by 1 person

  13. Hi JJ

    love these bits –
    “With Otto’s help, Germany’s national weather agency is preparing to be the first in the world to offer rapid assessments of global warming’s connection to particular meteorological events”

    “After three days’ work with computer models, they announced on 27 July that their preliminary analysis for northern Europe suggests that climate change made the heatwave more than twice as likely to occur in many places”

    must piss them off that they have to alternate between "climate change" & "global warming" when we all know it's the dreaded MMGW that did it.

    Liked by 1 person

  14. I recognised the name and yes, she is a product of Schellnhuber’s Climate Scares R Us institute, Potsdam. The Potsdam Cuckoos have been circulating for some time.

    Poor Fredi has limited exposure to science: Potsdam to Oxford ECI, and then promotion at ECI, with precious little real scientific experience, but job titles are the thing:
    https://uk.linkedin.com/in/friederike-otto-1476743a

    PhD student/ research assistant
    Potsdam-Institute for Climate Impact Research
    January 2008 – June 2011 (3 years 6 months)

    Environmental Change Institute (ECI), University of Oxford
    January 2013 – February 2014

    Senior Researcher and scientific coordinator, climateprediction.net
    Environmental Change Institute (ECI), University of Oxford
    February 2014 – Present

    http://www.eci.ox.ac.uk/people/fotto.html

    “Friederike (Fredi) is the Deputy Director of the Environmental Change institute and an Associate Professor in the Global Climate Science Programme where she leads several projects understanding the impacts of man-made climate change on natural and social systems with a particular focus on Africa and India.

    Her main research interest is on extreme weather events (droughts, heat waves, extreme precipitation), improving and developing methodologies to answer the question ‘whether and to what extent external climate drivers alter the likelihood of extreme weather’. She furthermore investigates the policy implication of this emerging scientific field.

    Fredi is co-investigator on the international project World Weather Attribution which aims to provide an assessment of the human-influence on extreme weather in the immediate aftermath of the event occurring.”

    A previous Director of ECI is Diana Liverman, a co-author of the recent "Hothouse Earth" PNAS Perspective piece, which had as co-authors other Potsdam emerging young scientists and several from the Stockholm Resilience Centre, now forging stronger links with Potsdam via the new appointment of Rockström as Co-Director of Potsdam on Schellnhuber's "retirement".

    Fredi has co-authored very frequently with Chief Climate Attributor and Litigator, Myles Allen of the ECI, who in 2003 started a distributed computing program at Climate Prediction.net whereby interested volunteers could download software onto their domestic computers and run climate simulations, with the results passed back to the project.

    The claim was and is that, with tens of thousands of computers contributing their idle time, the number of model runs would be vastly increased compared to current practice, leading to "reduced uncertainty". No doubt https://www.worldweatherattribution.org evolved from https://www.climateprediction.net/, which, despite being a complete failure, is still funded.

    Allen was part of the activist meeting in 2012 to push for litigation against Exxon, described here by James Delingpole and now coming to pass:
    https://www.breitbart.com/big-government/2015/11/05/exposed-green-activists-cooked-rico-conspiracy-big-oil/

    He was a Lead Author on IPCC AR5, WG1, Chapter 10: Detection and Attribution of Climate Change: from Global to Regional. He has co-authored with Peter Stott of the Met Office, who was the Coordinating Lead Author for the same IPCC Chapter 10 and was his co-author on the 2003 heatwave claims.
    Stott is Head of "Climate Monitoring and Attribution" at the Met Office. Now, if your job is to attribute any changes in weather to human-induced global warming, isn't that exactly what you are expected to come up with?

    Allen on modelling: “What we found is these models being fed back to us from our participants show—by just varying the things within the ranges of uncertainty, varying certain aspects of the model within the range of uncertainty—these models are giving us warmings to that same increasing carbon dioxide, ranging to up to over 10 degrees Centigrade. So this is more than double the upper end of the previously accepted range.”

    Allen on Climate Retribution in a BBC interview in 2003:
    “The vast numbers affected by the effects of climate change, such as flooding, drought and forest fires, mean that potentially people, organisations and even countries could be seeking compensation for the damage caused. “It’s not a question we could stand up and survive in a court of law at the moment, but it’s the sort of question we should be working towards scientifically,” Myles Allen, a physicist at Oxford University, UK, told the BBC World Service’s Discovery programme.”

    Fredi has been well taught.

    Check out more here:
    PLAYING CLIMATE GAMES
    The Latest Attempt to Blame Carbon Dioxide for Extreme Weather (2011)
    http://scienceandpublicpolicy.org/science-papers/originals/playing-climate-games

    Liked by 1 person

  15. Thank you for that wealth of useful information, Dennis. I didn't realise Fredi Otto was a PIK protégé. It all looks rather cosy and cliquey, doesn't it?

    Like

  16. Jaime does something here that is taboo in climate "science" but is standard practice in areas like economics and business planning. She questions the assumptions against the empirical data. The conclusion is that in Western Europe trends in heatwaves vary across geographically short distances.

    Local variability does not just impact on the attribution of extreme weather to rises in GHG levels. It also impacts on the ability to construct global average temperatures from land-based surface temperature measurements. To construct a global average first requires removing measurement biases due to wonky thermometers, station moves, UHI etc. Venema et al. 2012 (doi:10.5194/cp-8-89-2012) states:

    The most commonly used method to detect and remove the effects of artificial changes is the relative homogenization approach, which assumes that nearby stations are exposed to almost the same climate signal and that thus the differences between nearby stations can be utilized to detect inhomogeneities.

    In early 2015 I looked at various anomalies in temperature data between the pre-homogenised and final data. In a number of cases across the globe it is clear that the underlying assumption does not hold: temperature trends do vary across quite short distances. Some of these we might be able to find reasons for; others could be part of the chaotic nature of climate. A concluding article, Defining "Temperature Homogenisation", brought this together. An implication of temperature trends varying across relatively short distances is that pre-WW2 (when there were far fewer temperature stations outside of Europe & the USA), homogenisation will understate the global warming trend in the early period. Allowing for that factor will decrease the non-quantifiable attribution of pre-1950 warming to AGW.
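
    For anyone unfamiliar with the technique, the core of relative homogenisation can be sketched in a few lines: subtract a neighbouring station's series from the candidate's, so that (assuming a shared climate signal) the climate cancels and any step left behind is flagged as an artificial break. A toy illustration in Python with synthetic data:

        import numpy as np

        rng = np.random.default_rng(1)
        climate = rng.normal(0.0, 0.5, 100)             # shared regional signal
        neighbour = climate + rng.normal(0.0, 0.2, 100)
        candidate = climate + rng.normal(0.0, 0.2, 100)
        candidate[60:] += 0.8                           # artificial break (station move)

        diff = candidate - neighbour                    # shared climate cancels out
        # Crude break detection: largest mean shift across candidate breakpoints
        scores = [abs(diff[:k].mean() - diff[k:].mean()) for k in range(10, 90)]
        print("estimated break at index:", 10 + int(np.argmax(scores)))  # ~60

    The assumption I am questioning is precisely that shared 'climate' term: where real trends differ over short distances, the difference series contains genuine climate change as well as artefacts, and the adjustment removes both.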

    Liked by 2 people

  17. NBC have published an article today on the subject of extreme weather attribution.

    https://www.nbcnews.com/news/us-news/global-warming-can-make-extreme-weather-worse-now-scientists-can-n901751

    “Researchers no longer hesitate to blame climate change for floods, fires and heat waves. Here’s how the science works.”

    Otto is quoted throughout:

    “Usually scientists have been quiet or said only that ‘This is the kind of event that we would expect to happen more often,'” said Friederike Otto, deputy director of the Environmental Change Institute at Oxford University in England. “But now we can, and will, be able to say more.”

    Many of the researchers in the field are determined to ensure that experts, not amateurs, drive the discussion of unusual weather. “If the answer is not given by scientists, it will be given by politicians or someone with an agenda,” Otto said. “We want to make sure there is scientific evidence in this debate.”

    Experts, not amateurs. You got that? No pesky sceptics with an ‘agenda’ pointing out historical incidences of weather just as extreme, if not more so, or pointing to huge holes in the logic of said experts.

    You really can’t make this stuff up.

    Like

  18. Today is the 5th anniversary of this article by Jaime appearing. Curiously, the 2018 heatwave affected northern Europe, while southern Europe was cool. This year it's the other way around. Of course, both events are because of (or made much more likely by) anthropogenic climate change.

    Liked by 1 person

  19. In 5 years, Otto's extreme weather attribution methodology hasn't changed one bit. What strikes me now is that the 2018 northern Europe heatwave wasn't really that extreme at all and was pretty normal as far as past observations go, most especially further south. It was only in the Arctic that the heat was highly unusual, and this was almost certainly a result of the stationary high pressure cell, which in turn would have been caused by a very northerly jet stream, probably linked to multidecadal changes in the North Atlantic Oscillation. As Mark points out, the same jet stream pattern was causing rather cool weather in southern Europe, a region which was not included in the attribution, though it should have been, just like the cool and wet July weather in northern Europe should have been included in Otto's most recent attribution of the southern Europe heatwave. It's all such nonsense, but the press keep regurgitating it as 'science'.

    Liked by 1 person

  20. Mark, Jaime – thanks for a refresh on this. It seems it's still a toss-up game of "heads we win, tails you lose" against anybody that dares to challenge this with even recent facts.

    Liked by 1 person

  21. At Climate Depot:

    Via The Australian: Cliff Mass, professor of Atmospheric Sciences at the University of Washington, said the public was being "misinformed on a massive scale": "It's terrible. I think it's a disaster. There's a stunning amount of exaggeration and hype of extreme weather and heatwaves, and it's very counter-productive," he told The Australian in an interview. "I'm not a contrarian. I'm pretty mainstream in a very large [academic] department, and I think most of these claims are unfounded and problematic".

    It's a shame that Cliff Mass appears to be the only scientist consistently calling out these peddlers of pseudoscience. As I've said earlier, the scientific community in general has been very slow to respond.

    Top climate scientists rubbish claims July was the hottest month ever – Public being ‘misinformed on a massive scale’

    Liked by 1 person
