Wednesday, 30 December 2015

More on the 'Unprecedented' Rainfall Causing UK Flooding

In response to my linking to my latest blog post on Twitter, BBC weatherman and meteorologist Simon King pointed me to this graph of UK annual rainfall since 1910, which shows a significant increase in the trend since 1980 - a point he apparently made when interviewed on BBC FiveLive.
[Graph: UK annual rainfall since 1910]
Aside from the fact that the UK series is very much shorter than the EWP series, which goes back to 1766, there is indeed a marked positive trend in UK annual rainfall starting around 1973, exceeding the levels seen around 1910. Here is the corresponding graph for England:
[Graph: England annual rainfall since 1910]
And for Wales:
[Graph: Wales annual rainfall since 1910]
Though the same trend exists, it is not very pronounced. It is more pronounced for Northern Ireland:
[Graph: Northern Ireland annual rainfall since 1910]
It is a lot more pronounced for Scotland:
[Graph: Scotland annual rainfall since 1910]
Thus, it would appear that the overall increase in annual UK precipitation from 1973 is due in large measure to an increase in rainfall in Scotland, with a lesser contribution from Northern Ireland and only very minor contributions from England and Wales.
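For anyone who wants to check these regional trends for themselves, the calculation is just a least-squares fit over the chosen period. Here is a minimal sketch, assuming the annual HadUKP totals have been saved locally as a CSV; the filename and column layout are my own assumptions for illustration, not the Met Office format.

```python
import numpy as np
import pandas as pd

# Assumed layout: one row per year, one column of annual totals (mm) per
# region, e.g. columns 'UK', 'England', 'Wales', 'Scotland', 'N_Ireland'.
data = pd.read_csv("hadukp_annual_totals.csv", index_col="year")

def trend_mm_per_decade(series, start, end):
    """Least-squares linear trend over the inclusive period [start, end]."""
    s = series.loc[start:end].dropna()
    slope, _intercept = np.polyfit(s.index.values, s.values, 1)
    return 10.0 * slope  # mm per year -> mm per decade

for region in data.columns:
    print(f"{region}: {trend_mm_per_decade(data[region], 1973, 2015):+.1f} mm/decade")
```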
Cumbria, Lancashire and Yorkshire are, of course, in England, and the Environment Agency is responsible only for flood defences in England. So pointing out that rainfall in the UK has increased significantly, when much of that increase has been in Scotland and Northern Ireland, is not especially relevant, particularly when the flooding in England has provoked sharp criticism of England's freshwater flood preparedness as managed (badly, it seems) by the Environment Agency. It is even less relevant given that the UK annual rainfall data show no really significant increase in winter precipitation since 1910 - and winter rainfall causing flooding is precisely the issue at present.
[Graph: UK winter precipitation since 1910]
2014 again stands out, but there is very little overall increase in trend since 1910. Given that winter 2015/16 still has two more months to go, it may or may not turn out to be a particularly wet winter overall.
The lesson here (if there is one) is, don't listen to BBC Radio FiveLive if you want all the facts about current severe flooding and what may be contributing to it.

Monday, 28 December 2015

Flooding in the NW - Is December's Rainfall 'Unprecedented'?

Unprecedented flooding, unprecedented river levels, even - according to Liz Truss - unprecedented rainfall. I thought I would take a look at the HadUKP data to see if the hype from the politicians matches up to reality. Short answer: not really.

Firstly, let's take a look at the December graph for England and Wales, which goes back to 1766 (2015 data is obviously not in yet as there are a few days to go).


What is obvious is that for the first 100 years after 1766, England and Wales were relatively dry for this month. Thereafter, it got quite a lot wetter in general. In 1876, the record for December rainfall was set. Since then, no December has ever been wetter. 2015 looks unlikely to beat the 1876 record of 194mm - it is 120mm as of Dec 26th.



We would need very heavy rainfall widely across England and Wales over the next few days for this record to tumble. But regardless of whether it does or not, it can be seen that, from the beginning of the 20th Century, there has been no clear upward trend in December rainfall in England and Wales. Indeed, if we look at the decade centred on 1910, we can see that this actually had the wettest run of Decembers in the entire period.
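The 'decade centred on 1910' comparison is simply a centred running mean of the December totals. A minimal sketch of that calculation follows; the filename and column name are assumptions, not the actual HadUKP file format.

```python
import pandas as pd

# Assumed input: EWP December totals (mm) indexed by year.
december = pd.read_csv("ewp_december_totals.csv", index_col="year")["rain_mm"]

# 11-year centred mean, i.e. the running 'decade centred on' each year.
decadal = december.rolling(window=11, center=True).mean()

print("Wettest centred decade is centred on:", decadal.idxmax())
print("Mean December total in that window:", round(decadal.max(), 1), "mm")
```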

So, what about December rainfall in the North West?


Again, no overall increase is discernible since about 1910. It is noticeable that from about 1907 to 1920, Decembers in this region were consistently very wet, hence the peak in the decadal average. It has been quite wet in the NW in December since 1980 - though not consistently so - and the record was set in the early 80s; that record might also be broken by this year's very wet December. But 'unprecedented'? Hardly. The NW records only go back to 1873. If 2015 rainfall exceeds the early-80s record, it will only be 'unprecedented' within a record not quite 150 years long, and even then a fairly isolated extreme occurrence.

Finally, let's look at winter rainfall in general, firstly in the NW:


Once again, we don't see much evidence of a definite trend throughout the last 100 years. The record wet winter for this region occurred around 1992. Will winter 2015/16 be wetter overall? Jan and Feb are yet to come. Watch this space. What about winter precipitation in England and Wales as a whole?


It's the same story really. No clear increasing trend in winter precipitation since the very wet period centred around 1910. 2014 stands out as a clear record, but very wet winters would have to continue for a good few years yet for meteorologists to point to a trend, and then for climatologists and politicians to tell us that that trend is anthropogenic in origin.

So, the data (inconvenient as it may be) doesn't match the climate change rhetoric spouted by politicians looking for an excuse for a woeful lack of flood preparedness, or the hype of green activists and their insistence on bandying around their favourite term, 'unprecedented'.

Friday, 11 December 2015

Did Climate Change (TM) Cause the UK Floods? [Pt. 2]

Here we go again. Another wet UK winter, another round of attempts by politicians and activist scientists to capitalise upon winter flooding to advance their global warming agenda - in the case of politicians, to justify their need to impose swingeing taxation upon homes and businesses in the pursuit of a low carbon economy. First off the blocks was Dame Julia Slingo of the Met Office, who last time (the Somerset flooding in winter 2013/14) said "all the evidence points to a link with climate change". This time there is a little more caution, or should I say, a greater use of weaselly words which allow for more wriggle room but still manage to convey intact the essential message that global warming is wrecking our weather:
"It’s too early to say definitively whether climate change has made a contribution to the exceptional rainfall. . . . . However, just as with the stormy winter of two years ago, all the evidence from fundamental physics, and our understanding of our weather systems, suggests there may be a link between climate change and record-breaking winter rainfall. Last month, we published a paper showing that for the same weather pattern, an extended period of extreme UK winter rainfall is now seven times more likely than in a world without human emissions of greenhouse gases."
Cue Liz Truss, our Environment Secretary un-extraordinaire, who then joined the party by claiming that the pattern of winter flooding was what could be expected due to climate change. The Times Environment Editor Ben Webster begged to differ, as did Christopher Booker, among others. The point is, what caused the floods was our weather, and what drives our weather during such wet winter periods is depressions crossing or forming in the Atlantic, guided by the Jet Stream. To glibly state that 'fundamental physics' suggests there may be a link between heavy rainfall in winter and global warming is ridiculous; it's just more global warming propaganda from the Met Office Chief Scientist. There is a huge variety of factors at play in determining the weather of the British Isles and its changing patterns, and if you want to get to the root of what may be the emergence of a new pattern, you have to examine those factors in detail.
What is virtually certain is that neither the Somerset flooding nor this year's Cumbrian floods are in any way 'unprecedented'. They are notable, they are unusual in the context of 'recent' weather; taken in combination with just two years separating them, they may even be suggestive of a change in the pattern of winter weather over the British Isles, be it temporary or more long term. Certainly, the rain which Storm Desmond dumped on Yorkshire, causing the waterfall over Malham Cove to flow for the first time in what may be hundreds of years, is a precipitation event worthy of note. But of course, what this means is that, long before cars, buses, planes and trains - and coal-fired power stations - were emitting vast amounts of fossil fuel CO2 into the atmosphere, it was raining just as heavily oop north as it is now. And England was probably cooler on average then, if the Central England temperature record is anything to go by. Which brings me to an interesting post by Paul Homewood.
We see from this that Lamb identified a period of increased storminess and elevated precipitation in Northern Europe at the close of the Medieval Warm Period and the beginning of the Little Ice Age. Indeed, Paul Homewood has another enlightening post which points to increased storminess actually during the Little Ice Age. It would appear that an enhanced temperature difference between mid-latitude air and much colder Arctic air during the LIA was the engine that energised sections of the Jet Stream, causing more storms over the British Isles, perhaps in combination with a more southerly-tracking, meridional ('looping', as opposed to zonal, or straighter) Jet Stream at that time. This is exactly the type of pattern that appears to be happening now and, in addition to causing very wet, mild, stormy winters, has also given us a notably cold winter in 2009/10, the second-coldest December since 1659 (in 2010) and the very cold Spring of 2013. So over the last 6 years, it hasn't all been wet and mild as far as winters go.
England has been cooling on average since about 2005 according to the CET record. The record warm year of 2014 has reversed that trend a little, but not erased it, and even with the warm winter we have so far had, 2015 is looking unremarkable. Meanwhile, despite a very strong El Nino, the Pause in global warming continues at least in the two satellite datasets, the Arctic this winter continues to gain ice at a record rate, whilst Greenland remains decidedly frigid and the North Atlantic 'cold blob' actually grows. Growing Arctic sea ice and a colder Atlantic are indicative of the slowing of the AMOC [Atlantic Meridional Overturning Circulation] and the downturn of the AMO [Atlantic Multidecadal Oscillation], which has recently peaked. These two major oceanic/climate indices, of course, are intimately linked.
With solar activity predicted to decline further in SC25, it is entirely feasible that the Northern Hemisphere at least may see a period of sustained cooling, especially if, as seems increasingly probable, the actual transient climate response to CO2 is less than that which emerges from the CMIP5 climate model runs. With all this in mind, I am persuaded towards the heretical viewpoint that the climate of the British Isles is changing, but not because of CO2; rather, we are headed for a cooler, more turbulent period due to natural variability - entirely in keeping with past variations in the British climate. Who knows, by 2100 we may be growing grapes on Tyneside, but I personally doubt that!

Thursday, 11 June 2015

Walker Lecture - Sir David King Warns of Coming European Heatwaves Using Graph Ending 2003

Ed Hawkins tweeted about Sir David King's talk at the University of Reading Walker Lecture yesterday. David King, if you are unaware, is the UK's Special Representative for Climate Change, appointed by the Foreign Secretary in September 2013. He was previously the Government's Chief Scientific Adviser (2000-2007), advocating action on climate change (global warming as it then was) during his tenure. Here is the graph he presented to back up his argument that, under current GHG emissions scenarios, we should expect heatwaves of the type that hit Europe in 2003 to be the norm by 2050.



I questioned Ed Hawkins about the graph as the axes were not very informative and there was no title. Ed kindly informed me that the graph showed modelled Central European temperatures 1900-2100 vs. observed (black), ending in 2003. So why show observations only up to 2003? The peak is very obvious - 2003 was, by all accounts, a phenomenally hot summer in Europe. European summers since have been nowhere near as hot - often fairly cool - hence the suspicion is that later observations were simply left off the graph because they did not match the projected rapid rise in summer temperatures. I cannot say for sure whether this is the case, but I can say that temperatures in Germany (a fair proxy for Central Europe) have failed to live up to the climate model expectations in recent years. First, let's look at a graph of average German summer temperatures 1997-2012:

 

As you can see, 2003 stands out clearly, but the eight years after show a decline in summer temperatures. Eight years, of course, is too short a period to judge for sure whether summer temperatures are declining in Germany, but the linear trend since 1997 is only very slightly positive and a polynomial fit shows a rise and then a decline. 2012 and 2014 were average and slightly-above-average summers in Germany, with only 2013 standing out as significantly warmer than the long-term average (June, July & August).
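The linear and polynomial fits referred to above are easy to reproduce with numpy; the sketch below uses synthetic summer temperatures purely to show the mechanics, not the real German (DWD) figures.

```python
import numpy as np

def summer_trends(years, temps):
    """Return the linear trend (deg C per decade) and a quadratic fit."""
    slope, _intercept = np.polyfit(years, temps, 1)
    quad = np.poly1d(np.polyfit(years, temps, 2))
    return 10.0 * slope, quad

# Synthetic illustrative series only -- not the actual German JJA means.
rng = np.random.default_rng(0)
years = np.arange(1997, 2013)
temps = 17.0 + 0.02 * (years - 1997) + rng.normal(0.0, 0.6, years.size)

per_decade, quad_fit = summer_trends(years, temps)
print(f"Linear trend: {per_decade:+.2f} deg C per decade")
print("Quadratic fit at 1997 and 2012:", quad_fit(1997), quad_fit(2012))
```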

On a further note, it would appear that early Spring and Winter temperatures have actually been declining significantly in Germany since 1998.

So even though King's graph looks impressive, appearing to match observations with models very accurately, all is not quite as it seems. The hindcast data have very likely been 'tuned' to match observations; moreover, before 1950 the hypothesised anthropogenic GHG forcing is largely absent, so the models are in effect simulating mainly natural variability and - perhaps unsurprisingly - seem to do a fairly good job of it. It's when the models predict increasing global temperatures under ever greater GHG forcing that they seem to run into trouble, with projected temperatures seen to be increasingly divergent from reality. As can be seen from King's graph, temperatures in Central Europe rose rapidly from 1900 to about 1950, cooled to 1975/80, then rose rapidly again - at what looks to be approximately the same rate as the pre-1950 warming. The post-2003 data are highly likely to be flat, very slightly increasing or even declining - unlike the modelled increase. It is this modelled increase in mean summer temperatures which allows King to claim that the damaging 2003 heatwave in Europe will be 'normal' by 2050 - unless, of course, we drastically curb emissions. To my mind, this is not science; it is political posturing ahead of COP21 in Paris in 2015.

Tuesday, 24 March 2015

A Brief Conversation with Michael Mann

It has probably not escaped climate change sceptics' notice that there is a new Nature paper by Stefan Rahmstorf, Mann et al on the supposed 'unprecedented' slowdown in the North Atlantic thermohaline circulation (THC) during the 20th Century. The Washington Post and the Independent have picked up on it in typical alarmist media fashion and Rahmstorf himself is promoting it over at RealClimate. Mann is hawking it on Facebook like it was The Day After Tomorrow.

Now Mann has blocked me on Twitter but hasn't got around to doing so on Facebook (might have by now!), so I thought I would add a comment or two to one of his posts on this new paper. As you can see from his Facebook profile, he really is going overboard promoting this study in typical alarmist fashion.

Anyway, imagine my surprise when he actually replied to my first comment, so I added a reply and quickly took a snapshot of the convo guessing that he would do exactly what he did - remove the comments. Here it is.



A (brief) Conversation with Michael Mann - quickly deleted



Monday, 9 March 2015

Climate Change by Numbers - Response Pt. 2

On to the last of the guest presenters, David Spiegelhalter, who helps people like the NHS predict the future. I trust he is better at that than he is at assessing the past! One trillion tonnes of carbon, he informs us, is our budget beyond which we can expect 'dangerous' climate change (a 2C rise above pre-industrial levels). We've already burnt around half a trillion tonnes since the beginning of the Industrial Revolution and "that's given us almost a degree of warming", says Spiegelhalter. So what he is saying, in effect, is that Hannah's 0.85 degree temperature rise since 1880 is all down to the burning of fossil fuels. Even the IPCC does not go this far. They say:

"Greenhouse gases contributed a global mean surface warming likely to be between 0.5°C and 1.3°C over the period 1951–2010, with the contributions from other anthropogenic forcings likely to be between –0.6°C and 0.1°C, from natural forcing likely to be between –0.1°C and 0.1°C, and from internal variability likely to be between –0.1°C and 0.1°C.
Together these assessed contributions are consistent with the observed warming of approximately 0.6°C over this period. [10.31, Figure 10.5]"

This ties in with the IPCC AR5 attribution statement:

"It is extremely likely that more than half of the observed increase in global average surface temperature from 1951 to 2010 was caused by the anthropogenic increase in greenhouse gas concentrations and other anthropogenic forcings together. The best estimate of the human-induced contribution to warming is similar to the observed warming over this period."

Greenhouse gas concentrations in the atmosphere have only been measured really accurately at Mauna Loa since 1958, when around 315ppm was registered. Before that, scientists rely upon low-resolution ice core data which appears to show a very gradual increase from a baseline of about 280-290ppm in 1880. It's only during the 1950s that atmospheric CO2 concentrations really take off.




Note the very rapid 1910 to 1945 warming followed by the sharp cooling to 1950 - all when CO2 was not much above pre-industrial levels and increasing only very gradually. Therefore, we must conclude that natural variability was in charge prior to 1950, certainly with respect to major ups and downs. AGW purists might wish to claim that the general upward trend in temperatures since 1880 is also down to anthropogenic CO2, but then they would also have to explain the longer general upward trend in global temperatures since the end of the LIA without resorting to more plausible natural (solar) influences. In summary, it's quite likely that the increase in global temperature from 1880 to 1950 can be mainly attributed to natural (internal and external) forcings. But the BBC's Climate Change by Numbers would have its viewers think otherwise.

The so-called Monte Carlo method of statistically predicting the most likely future outcome of anything from Formula 1 races to NHS medical policy/practice to CO2-induced climate change is a powerful mathematical tool involving the repeated simulation of many different scenarios, Spiegelhalter tells us. I am sure it is - given the right data. The climate model runs cluster around a 'most likely' multi-model mean which suggests that we can expect to hit the 2C 'dangerous' threshold within the next 30 years or so if we burn all of the remaining half a trillion tonnes of carbon. What he neglects to mention, of course, is that the multi-model mean is way ahead of actual observed temperatures and, in fact, the vast majority of the climate models run significantly warmer than reality. Clearly, something is amiss with the models. The Monte Carlo method is faithfully predicting the most likely future outcome, but an outcome probably based upon incorrect assumptions and data about the real climate: about natural oceanic oscillations, cloud feedbacks, water vapour feedbacks, solar variability, and so on. However complex the climate models are, they are mere simplifications of what is actually going on in the coupled ocean-atmosphere system - and it appears that they are simply wrong. The longer the 'faux pause' continues, the more wrong they get.
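For readers who have not met it, the Monte Carlo method really is just 'run the calculation many times with randomised inputs and look at the spread of outcomes'. Here is a deliberately toy sketch of the mechanics; every number and distribution below is an invented assumption for illustration, not a claim about the real climate.

```python
import numpy as np

rng = np.random.default_rng(42)

# Toy Monte Carlo: sample an uncertain warming rate and an uncertain
# natural wobble, then count how often 2 C is crossed by 2050.
# All values are illustrative assumptions only.
n_runs = 100_000
current_anomaly = 0.85                              # deg C above pre-industrial
warming_rate = rng.normal(0.18, 0.06, n_runs)       # deg C per decade
natural_wobble = rng.normal(0.0, 0.15, n_runs)      # deg C in 2050

anomaly_2050 = current_anomaly + 3.5 * warming_rate + natural_wobble
print("Fraction of runs exceeding 2 C:", np.mean(anomaly_2050 > 2.0).round(3))
```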

Even if the models eventually prove to be right, "why should we worry about a rise of two degrees Celsius?" asks Spiegelhalter. Because of weather - extreme weather, to be more precise. Climate scientists tell us we can expect more frequent droughts, floods and storms - though the evidence for this so far is much less than convincing (see Roger Pielke Jr.'s work). But that's only part of the problem. Environmental engineers use a technique called Extreme Value Theory to allow for the occurrence of really extreme (e.g. once in a thousand years) events which will test their structures to the absolute limit. This relies upon collecting a lot of data about past extreme events, but in a warming world, Spiegelhalter tells us, such data becomes rapidly obsolete, and hence the predictions of extreme value theory, which rely upon broadly stable conditions, become increasingly unreliable. Our ability to plan for such events is therefore reduced, placing society at risk. This all sounds quite reasonable, but again it relies upon the unproven hypothesis that patterns of extreme weather events are changing, or will change, due to global warming - warming which, in fact, is also not happening: for the past 15 years or so, global surface temperatures have not increased at anything like the pace predicted by the climate scientists' models; indeed, they have not increased by a statistically significant amount in any dataset.
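For context, the Extreme Value Theory technique Spiegelhalter alludes to usually means fitting a Generalised Extreme Value (GEV) distribution to a series of annual maxima and reading off a return level. A hedged sketch using scipy follows, with synthetic data standing in for the gauge record a real engineer would use.

```python
from scipy.stats import genextreme

# Synthetic annual-maximum daily rainfall (mm), standing in for a real
# gauge record; the parameters are chosen only so the example runs.
annual_max = genextreme.rvs(c=-0.1, loc=40.0, scale=10.0, size=100,
                            random_state=1)

# Fit the GEV and estimate the 1-in-1000-year return level.
shape, loc, scale = genextreme.fit(annual_max)
one_in_1000 = genextreme.isf(1.0 / 1000.0, shape, loc, scale)
print(f"Estimated 1-in-1000-year daily rainfall: {one_in_1000:.1f} mm")
```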

Climate Change by Numbers? A for effort, C- for attainment.
Sorry, you're gonna need more numbers. #CCBN2


Friday, 6 March 2015

Climate Change by Numbers - Response Pt. 1

Well, after all the fuss and the feverish anticipation of this BBC program, I thought I had better get round to watching it, and I have managed two-thirds of the way through so far! I thought initially I might make a few (probably critical) comments in a blog post but, the more I examine this program, the more I feel I need to respond to specific (and some very unspecific) points made within. So let's start with the grand aim of the program - the goal of convincing us all that climate change (TM) is real via the wonder of mathematics and the analysis of three simple numbers: 0.85C, 95% and 1 trillion tonnes. Each number is assigned to one of three mathematicians (statisticians) - Dr Hannah Fry, Prof Norman Fenton and Prof David Spiegelhalter respectively. They are "three numbers which represent what we know about the past, present and future of earth's climate", Professor Fenton tells us right from the off. Oh dear, not an auspicious start!

Hannah Fry (and four-legged friend Molly) tackles the mysteries of measuring global temperature and why 0.85 degrees is so important (or maybe not, as the case may be). Cue bucket talk - canvas and wooden - then on to an explanation of how errors are teased out of the historic global temperature record using Kalman filtering. Now this, according to the program, is what enabled men to land on the Moon in 1969 and, under a different title - homogenisation - is what enables climate scientists to 'clean up' past temperature data and iron out irregularities. All sounds perfectly reasonable. But Hannah Fry tells us that "other people" will "inevitably start accusing" climate scientists of "building bias" into their data once the raw data is cleaned up in this way and replaced with homogenised data. Those 'other people' sound like those nasty sceptics/deniers/contrarians your mother warned you about. They also sound suspiciously like those conspiracy theorists which Lewandowsky warned us about - you know, the kind of people who will question the unquestionable - the global temperature record, built as it is upon the solid science which gave us the Moon landings. This subtle re-working of the infamous take-home message of LOG12 - whether planned or merely coincidental - was not lost on me; but then again, we ideational conspiracists often see hidden messages that others fail to detect!
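For anyone curious about what a Kalman filter actually does, it simply blends a prediction with each new noisy measurement, weighted by their respective uncertainties. Here is a minimal one-dimensional sketch of my own; it has nothing to do with the Met Office's actual homogenisation code.

```python
import numpy as np

def kalman_1d(measurements, meas_var, process_var, x0=0.0, p0=1.0):
    """Minimal 1-D Kalman filter: smooth a noisy sequence of readings."""
    x, p = x0, p0
    estimates = []
    for z in measurements:
        p = p + process_var            # predict: uncertainty grows
        k = p / (p + meas_var)         # Kalman gain
        x = x + k * (z - x)            # update towards the measurement
        p = (1.0 - k) * p              # uncertainty shrinks after update
        estimates.append(x)
    return np.array(estimates)

# Synthetic noisy readings around a 'true' value of 15.0 degrees.
rng = np.random.default_rng(2)
noisy = 15.0 + rng.normal(0.0, 0.5, 50)
print(kalman_1d(noisy, meas_var=0.25, process_var=0.01)[-5:])
```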

Next we get on to data 'infilling' (known as kriging, named after the South African mining engineer Danie Krige). Then swiftly on to the pause which is not a pause, which does or does not exist, and which anyway is just one of a number that were always expected to appear and briefly interrupt the general man-made warming trend. No matter that the 'expectation' of a 15-year hiatus was just once in every 375 years of model runs (p.8)! But, you know, limited time and all; the program just couldn't fit the explanation of this mere 'detail' into its breathless schedule, even though it tediously slow-burned its way through endless minutes of story-telling build-up to the main points. "Mathematical manipulation of the raw data can look like fiddling the figures", Fry tells us, and the techniques which climate scientists have used are well understood and "all lead in the same direction". I'm sure this phrase was an unfortunate choice, but yes, I think that's one of the main contentions of some sceptics - the fact that, in general, these adjustments do all lead in the same direction, i.e. more warming! Summing up then, Hannah concludes that, even with all the uncertainties in the past temperature record, the scientific consensus is that the earth's temperature has risen by 0.85C since 1880.
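As for kriging, it is essentially interpolation that respects spatial correlation; the same idea is available in Python as Gaussian process regression. A sketch with invented 'station' data follows, purely to show the shape of the calculation, not anything resembling the actual infilling used for the global temperature record.

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF

rng = np.random.default_rng(3)

# Invented 'stations': coordinates (arbitrary units) and anomalies (deg C).
coords = rng.uniform(0.0, 10.0, size=(30, 2))
anomalies = np.sin(coords[:, 0] / 3.0) + rng.normal(0.0, 0.1, 30)

# Fit a kriging-style interpolator and infill a point with no station.
gp = GaussianProcessRegressor(kernel=RBF(length_scale=2.0), alpha=0.01)
gp.fit(coords, anomalies)
mean, std = gp.predict(np.array([[5.0, 5.0]]), return_std=True)
print(f"Infilled anomaly: {mean[0]:.2f} +/- {std[0]:.2f}")
```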

This is fairly uncontroversial, even amongst climate change sceptics, who generally acknowledge that the planet has indeed warmed overall (but certainly not uniformly, either spatially or temporally), since the 19th century and indeed since the Little Ice Age ended. The controversial part, as Hannah points out, is how much we are responsible for that warming, which naturally leads onto the second guest presenter, Prof Norman Fenton. He will tell us how the IPCC are 95% certain that we are responsible for virtually all of the warming, but it's not 'this' warming (0.85C), it's the warming that has taken place since 1950. So why bang on about the 'almost a degree' warming since 1880 when it does not relate directly to the second crucial figure? Why not bang on about the 'slightly more than half a degree' warming since 1950 which the IPCC tells us is all down to CO2? I'll leave the reader to speculate on that.

Norman Fenton is a Spurs fan and a down-to-earth Londoner - so we have one thing in common at least! In order to explain the mystery that is an IPCC climate change 'attribution study', he chooses to model the performance of Premier league teams and finds that, in amongst a variety of factors, one stands out as having a very marked influence upon performance - the wage bill. Fenton creates a simple model which predicts the performance of teams based upon various factors. First he shows us the rather good fit of model vs. actual performance for Man City. Impressive. Then he shows us Liverpool:



[Screenshot from BBC iPlayer, 'Climate Change by Numbers': model vs. actual performance for Liverpool]



Hmmm, not quite so impressive, but unabashed, Prof Fenton says that this good model fit is "true for all of the teams in the Premier League". He goes on: "Now I know I can trust my model, we can move on to the clever bit". He really does seem to have this IPCC attribution study nailed! The "clever bit" is isolating which factor, if any, has a dominant effect upon any team's performance - which turns out to be the wage bill. This is 'attribution', and the principle is the same for climate change, albeit the latter situation is far more complex, with an extremely complicated web of interacting variables needing to be taken into account rather than just a few. Most of these variables are related to natural (internally and externally forced) climate variability but, nevertheless, the IPCC has looked at them all and concluded that their net effect since 1950 is near zero - hence they can attribute virtually all global warming since 1950 (but not since 1880) to CO2 emissions with a startling 95% certainty. How, you may ask? Well, firstly because of the depressed Swedish physicist Svante Arrhenius. It was he who, in a roundabout sort of way, first showed, "using maths", that a doubling of CO2 could warm the globe by 4C via the so-called Greenhouse Effect. So that's one crucial piece of 'evidence' in place; now just to 'prove' the assertion that natural variability has played virtually no part in recent global warming.
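The 'attribution' step described here is, at heart, multiple regression: fit the outcome against several candidate factors and see which standardised coefficient dominates. A hedged sketch with invented football-style data follows; it is not the program's actual model, just the general technique.

```python
import numpy as np

rng = np.random.default_rng(4)

# Invented data for 20 teams and three candidate factors.
n = 20
wage_bill = rng.uniform(50.0, 250.0, n)      # million pounds
stadium_size = rng.uniform(20.0, 75.0, n)    # thousand seats
manager_tenure = rng.uniform(0.0, 10.0, n)   # years

# Invented 'truth': points mostly follow the wage bill, plus noise.
points = 0.3 * wage_bill + 0.05 * stadium_size + rng.normal(0.0, 5.0, n)

# Standardise the factors so their coefficients are comparable.
X = np.column_stack([wage_bill, stadium_size, manager_tenure])
Xs = (X - X.mean(axis=0)) / X.std(axis=0)
design = np.column_stack([Xs, np.ones(n)])   # add an intercept column
coefs, *_ = np.linalg.lstsq(design, points, rcond=None)

for name, c in zip(["wage bill", "stadium size", "manager tenure"], coefs):
    print(f"{name:15s} standardised effect: {c:+.1f}")
```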

"If the cycles of the Sun were a major cause of the rise in temperature we've measured, then what we would see would be all the layers of the Earth's atmosphere warming together like this":


[Screenshot from BBC iPlayer, 'Climate Change by Numbers': atmospheric layers warming together under solar forcing]


I must admit, this is a new one on me and something I must look into further. Even more interestingly, Fenton then presents us with what he says has actually happened over the last 60 years:


[Screenshot from BBC iPlayer, 'Climate Change by Numbers': what has actually happened over the last 60 years]



What this shows is accelerated warming of the troposphere and cooling of the stratosphere. The stratosphere has indeed cooled since the 1960s - though this cooling trend has halted since the mid to late 1990s.

  
[Graph: global upper-air temperatures, troposphere and stratosphere]



Now, Fenton informs us that the models show that this pattern (tropospheric warming/stratospheric cooling) only fits well with anthropogenic CO2 being the principal cause of recent global warming. What he neglects to mention is:
  1. The tropical mid-tropospheric 'hotspot' which we clearly see in his 'actually happening' representation above has failed to materialise, even though it is one of the key predictions of the climate models. There has been no observed accelerated warming of the mid troposphere over the tropics. So even though the troposphere as a whole has warmed and the stratosphere cooled, the mid troposphere has not warmed significantly compared to the surface.
  2. There is an alternative (anthropogenic) explanation for stratospheric cooling and surface/tropospheric warming which involves CFCs. Basically, the hypothesis is that accelerated ozone loss caused by CFCs in the stratosphere has resulted in cooling of that portion of the atmosphere whilst, at the same time, UV energy which would normally be absorbed by stratospheric ozone (warming the stratosphere) has passed straight through to warm the lower troposphere. This might explain the 'pause' in global warming from about 1998 and the corresponding 'pause' in stratospheric cooling as a direct result of the decline in the concentration of ozone depleting CFCs in the upper atmosphere since the international adoption of the Montreal Protocol in January 1989.
There are also explanations which invoke natural causes for some or even most of the warming we have seen since the 1950s. None of this gets an airing though. I guess, again, lack of programming time probably precluded the mention of these facts, though apparently not the mention of the 'facts' of declining Arctic sea ice, increasing heatwaves and ocean acidification as further 'compelling' evidence for the 'human fingerprint' of global warming. If the viewer is not convinced up to this point, then the almost exact match of the graph of actual warming (yellow) vs. modeled (red - anthro + natural) and the very poor fit of natural only (green) modeled warming is a killer:



[Screenshot from BBC iPlayer, 'Climate Change by Numbers': observed vs. modelled warming]


Alas, once again the tight programming schedule does not allow Fenton to explain to his viewers the assumptions inherent in this graphic, namely a high-end climate sensitivity to CO2 (increasingly looking doubtful), negligible solar influence, and internal variability which has had nearly zero net effect since about 1950. Those assumptions are challengeable.

Fenton says that the AGW fingerprint in global warming is as clear as the wage bill in his Premier League performance model; in fact, so clear that the IPCC assigned 99% certainty to it. It would seem that the poor dears were so embarrassed by the devastating clarity and certainty of their science that they downgraded their figure to merely 'greater than 95%', just in case there were any hidden errors for which they had not accounted! Now ain't that humility and honesty! Professor Norman Fenton appears to have had a bit of a rethink with regard to his presentation in the program. In his blog post he comes across as much more guarded about the role of humans in climate change than as portrayed in the program. Well worth reading.

Sunday, 8 February 2015

Science and Statistics - An Unholy Alliance?

I came across this very interesting article the other day, written in 2010. It basically reinforces my own long-held sense of unease with regard to statistical analysis. My reaction against statistics started early, in school, when I was first presented with its somewhat bizarre pseudo-mathematical methodology and nomenclature. My early rejection of the subject was more visceral and emotive than factual and logical. It just hit a raw nerve with me somehow and, all these years later, reading this article by Tom Siegfried, I begin to see perhaps why. So let me begin by quoting a few passages from Siegfried's text:

 "During the past century, though, a mutant form of math has deflected science’s heart from the modes of calculation that had long served so faithfully. Science was seduced by statistics, the math rooted in the same principles that guarantee profits for Las Vegas casinos. Supposedly, the proper use of statistics makes relying on scientific results a safe bet. But in practice, widespread misuse of statistical methods makes science more like a crapshoot."

So it's not statistics itself, but the misuse of this analytical toolbox, which is the problem. To this I would add over-reliance, especially evident in the field of climate science. Too often in the peer-reviewed climate science literature we find papers which base their conclusions almost totally on the results of some new statistical analysis or re-analysis of existing data. In order to fully appreciate what they are saying and, more importantly, in order to question what they are saying, one needs to be an expert not primarily in climate science, but in statistical analysis.

"Statistical tests are supposed to guide scientists in judging whether an experimental result reflects some real effect or is merely a random fluke, but the standard methods mix mutually inconsistent philosophies and offer no meaningful basis for making such decisions. Even when performed correctly, statistical tests are widely misunderstood and frequently misinterpreted. As a result, countless conclusions in the scientific literature are erroneous, and tests of medical dangers or treatments are often contradictory and confusing."

This does not inspire confidence.
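Siegfried's 'random fluke' point is easy to demonstrate for yourself: run a standard significance test on pure noise many times and a steady fraction of the results come back 'significant'. A minimal sketch:

```python
import numpy as np
from scipy.stats import ttest_ind

rng = np.random.default_rng(5)

# 10,000 'experiments', each comparing two groups drawn from the SAME
# distribution -- i.e. there is no real effect to find.
n_experiments = 10_000
false_positives = 0
for _ in range(n_experiments):
    group_a = rng.normal(0.0, 1.0, 30)
    group_b = rng.normal(0.0, 1.0, 30)
    _, p_value = ttest_ind(group_a, group_b)
    if p_value < 0.05:
        false_positives += 1

# Expect roughly 0.05: one 'discovery' in twenty, from noise alone.
print("Fraction declared significant:", false_positives / n_experiments)
```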

"Experts in the math of probability and statistics are well aware of these problems and have for decades expressed concern about them in major journals. Over the years, hundreds of published papers have warned that science’s love affair with statistics has spawned countless illegitimate findings. In fact, if you believe what you read in the scientific literature, you shouldn’t believe what you read in the scientific literature."

With the increasingly pervasive use of statistical analysis in climate science, backed up by increasingly complex computer models, the above statement is magnified tenfold when considering the results of the latest peer-reviewed scientific research. Much of this research is aimed at pointing the finger at man as responsible for the majority of post-1950 global warming, claiming also that we will continue to drive the climate significantly into the future. Yet much of it is based upon statistical reanalysis of existing data.

"Nobody contends that all of science is wrong, or that it hasn’t compiled an impressive array of truths about the natural world. Still, any single scientific study alone is quite likely to be incorrect, thanks largely to the fact that the standard statistical system for drawing conclusions is, in essence, illogical. “A lot of scientists don’t understand statistics,” says Goodman. “And they don’t understand statistics because the statistics don’t make sense.”"

A perfect illustration: the recently released paper by Marotzke and Forster. The main impetus for the paper was to address the apparent mismatch between climate models and real world observations (in particular the 'pause') which sceptics use to question the validity of the AGW theory. The paper concludes:

"The differences between simulated and observed trends are dominated by random internal variability over the shorter timescale and by variations in the radiative forcings used to drive models over the longer timescale. For either trend length, spread in simulated climate feedback leaves no traceable imprint on GMST trends or, consequently, on the difference between simulations and observations. The claim that climate models systematically overestimate the response to radiative forcing from increasing greenhouse gas concentrations therefore seems to be unfounded."

So, climate models do not overestimate the response to GHG forcing, even though the CMIP5 model mean is increasingly diverging from actual recorded global mean surface temperatures (GMST) and even though almost all of the models clearly run 'too hot' when compared with actual GMSTs. Apparently, this impression is not borne out by statistically analysing the past temperature record and comparing that with the models [?] It's opaque to me, and probably to a lot of other people besides. Nic Lewis thinks it is plain wrong, and says so at Climate Audit, laying out his reasons. He gave Marotzke and Forster the opportunity to reply to his concerns about their paper, but they failed to respond before he published at Climate Audit. Instead, they have chosen to issue a rebuttal of Lewis' critique at Climate Lab Book here. I've no idea who will eventually be proved right or wrong in this kerfuffle, but I quote statistical expert Gordon Hughes (Edinburgh University), one of the two people whom Nic Lewis asked to review his conclusions about M & F 2015:

"The statistical methods used in the paper are so bad as to merit use in a class on how not to do applied statistics.
All this paper demonstrates is that climate scientists should take some basic courses in statistics and Nature should get some competent referees."  

The wider point here is that we have yet another paper which relies almost exclusively upon statistical methodology to draw conclusions about the real world - another paper which may have to be withdrawn. Science - and climate science in particular - is suffering from the all-too-pervasive influence of statistics. There is a place for statistics in the analysis of real-world data, and even I must (reluctantly) acknowledge this. However, science has, as Tom Siegfried points out, become "seduced" by the false promise of this "mutant" form of mathematics and is suffering from its misuse and overuse.