Tractatus Logico-Climaticus

It’s a century since Wittgenstein wrote his Tractatus Logico-Philosophicus, in the trenches, where he spent the Great War fighting for Austria against the Allies. It was published in England in 1922, with a generous introduction by Bertrand Russell, who had spent part of the same war in prison for opposing the same war. His praise was doubly generous, given that the whole point of the Tractatus is to prove that the life’s work of Bertrand Russell, whose lectures Wittgenstein had briefly attended while he was studying engineering at Cambridge, was a waste of time.

The Tractatus famously opens with Proposition 1:

“The world is everything that is the case.”

And then hammers home the point with propositions 1.1, 1.11 and 1.12:

“The world is the totality of facts, not of things. The world is determined by the facts, and by these being all the facts. For the totality of facts determines both what is the case, and also all that is not the case.”

From this unpromising beginning he manages to deduce, in a few short pages, a logical proof that philosophy is pointless, a big mistake. Except that you could never prove it; all you could do is point it out.

6.53 The right method of philosophy would be this: To say nothing except what can be said, i.e. the propositions of natural science, i.e. something that has nothing to do with philosophy: and then always, when someone else wished to say something metaphysical, to demonstrate to him that he had given no meaning to certain signs in his propositions. This method would be unsatisfying to the other—he would not have the feeling that we were teaching him philosophy—but it would be the only strictly correct method.

6.54 My propositions are elucidatory in this way: he who understands me finally recognizes them as senseless, when he has climbed out through them, on them, over them. (He must so to speak throw away the ladder, after he has climbed up on it.)

Instead of throwing away the ladder, let’s climb down from these dizzy heights and consider one part of the real world, or rather the facts of which it is constituted, or rather the propositions which describe these facts. Namely, the propositions describing global temperature.

(Let’s leave aside for the moment the argument that there’s no such thing as a global temperature. The same argument can be used against anything that’s measured, at some level of precision.)

My school atlas used to have pages of temperature graphs, showing the average temperature change month by month for a number of spots on the globe. So I learnt that the average temperature in London varied from 5°C in January to 19°C in July, while in Singapore it stayed steady at 28°C all year round. Nobody mentioned the change year by year from 1860 to the present, because it was too tiny to notice. Nowadays, everyone’s fascinated by the change in average global temperature since 1860, and no-one’s bothered by the difference in temperature between London and Singapore, since your hotel will have decent heating and air conditioning at both ends of your trip. The facts haven’t changed (Singapore is still hotter than London), but there’s been a revolutionary change in the choice of which propositions describing those facts get put forward.

Facts don’t change, but the propositions that describe them do. And the propositions about past temperatures are notoriously malleable. As Steve Goddard points out tirelessly, NOAA, which produces estimates of average global temperatures, is forever changing its estimates of past temperatures. Not all past temperatures, of course. Two or three centuries ago, there were just a few thermometers at a few universities, and presumably everything that we can ever know about them is already known. (And as for temperatures before then, only the tree rings know, and their utterances are frequently mysterious, but correctly interpreted in the peer-reviewed literature, until proved otherwise.) So there’s a kind of fixed point at the date when global temperature measurement became reliable, which is conventionally fixed at 1860. And of course, the present forms a second fixed point, since our current estimate of global temperature is as good as we know how to make it. (And the same goes for very recent temperature estimates, since it would be silly to announce on the second of January 2018 that the figure for 2017 announced the day before already needs adjusting. It would look sloppy.) But between these two fixed points, the temperature graph is a kind of loose string which can be fiddled with at will. So, while scientists are able to announce with certainty that there’s been a rise of 0.9°C in the past 150 years or so, the slope of their floppy string can vary quite significantly from decade to decade.

Of course, variations from one year to the next are of no interest, since they zigzag wildly. A jump or a drop of 0.2°C is not unusual, which, if projected forward, would amount to a rise or fall of 20°C in a century – a clear absurdity. So discussion tends to focus on decadal or multidecadal changes. Here SkepticalScience is your friend. They have a neat little tool (no, I don’t mean him) for estimating average temperature change over any period you like. By playing with it (or cherrypicking, to use the scientific term) you can get a recent decadal temperature rise of anywhere between 0.1 and 0.2°C, which represents a wide margin of error, but doesn’t get near the 3 or 4°C temperature rise by the end of the century which is used to frighten the masses and their bosses.
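
For the curious, the sum such a calculator does fits in a dozen lines of Python. The sketch below is not the SkepticalScience tool’s actual code, and the anomaly series is synthetic, a stand-in for whichever dataset you care to load; it only shows why a few years’ shift in the start date moves the “decadal trend” so much.

```python
import numpy as np

def decadal_trend(years, anomalies, start, end):
    """Ordinary least-squares trend over [start, end], in °C per decade."""
    mask = (years >= start) & (years <= end)
    slope, _ = np.polyfit(years[mask], anomalies[mask], 1)
    return 10 * slope  # °C/year -> °C/decade

# Synthetic stand-in: 0.15 °C/decade underlying rise plus yearly noise.
rng = np.random.default_rng(0)
years = np.arange(1970, 2018)
anoms = 0.015 * (years - 1970) + rng.normal(0, 0.1, years.size)

# Sliding the start year by a few years moves the "decadal trend" a lot.
for start in (1998, 2000, 2002, 2004):
    print(start, round(decadal_trend(years, anoms, start, 2017), 2))
```

None of those numbers are real; the point is that on a noisy series, cherrypicking the window can halve or nearly double the apparent rate without anyone touching the data.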

You can see the problem this poses if you imagine the temperature graph from now to 2100 as another loose piece of string fixed at both ends, one in the present and the other 3°C higher in 82 years’ time. Temperatures need to start rising at about 0.37°C per decade to get there – right now. That’s almost twice as fast as the biggest decadal rise you can cherrypick in the recent past. Even if you count the 3°C rise starting from about 1950 when our CO2 emissions began to take off, you still need temperatures to rise by more than 0.3°C per decade pretty sharpish. And of course, the longer you leave it, the sharper the eventual rise has to be. In other words, you need a hockeystick. Do any of the models predict a hockeystick? I don’t think so. How could they? What could possibly cause it? And where’s the elbow? This has an interesting implication. Even if temperatures do rise by the kind of amount the models predict, they’d still be wrong, since there’s nothing in their models to predict the sharp elbow needed to get there.
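
The sums behind that figure are worth making explicit, since they also show how the required rate climbs the longer the rise is postponed. A minimal check, taking the 3°C and the 2100 deadline as given:

```python
# Decadal warming rate needed to reach the assumed +3 °C by 2100,
# as a function of how long the rise is postponed.
TARGET = 3.0          # °C still to come (the assumption above)
END, NOW = 2100, 2018

for delay in (0, 10, 20, 30):
    decades_left = (END - (NOW + delay)) / 10
    print(f"rise starting {NOW + delay}: {TARGET / decades_left:.2f} °C/decade")
```

Starting now gives the 0.37°C per decade quoted above; wait thirty years and the string has to climb at nearly 0.6°C per decade. That steepening is the elbow.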

But let’s leave the future, which is just speculation (and my speculation is no better than the speculation of the world’s greatest minds, i.e. useless), and return to the past (which, unlike the future, can be changed at will by experts). Remember that average temperatures for past years have been regularly changed, almost always downwards. It should be possible for a more mathematical mind than mine (Kevin Marshall? Paul Homewood?) to calculate for any given year in the past, how much, on average, the global temperature has been changed downwards in any given following year. It could even be worked up into a SkepticalScience-type calculator which we could offer to John Cook and his fellow Skeptics as a present. And the really neat thing about this little tool (no, I don’t mean him) is that it could be used to predict how much the temperature for 2017 (or 2018 for that matter) would be likely to be reduced in any given future year. So we could adjust it straight away, instead of waiting for some future head of NOAA to do it for us at some distant date, when it will no longer be newsworthy.
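
In the meantime, here is a sketch of what the little tool might look like. Everything in it is hypothetical: the versions table stands in for successive published vintages of the same series, which someone would have to dig out of the archives, and the constant-drift model is the crudest imaginable.

```python
import numpy as np

# Hypothetical vintages: publication year -> {data year: anomaly in °C}.
# Real numbers would have to come from archived NOAA (or GISS) releases.
versions = {
    2010: {1998: 0.60, 2000: 0.48},
    2014: {1998: 0.57, 2000: 0.45},
    2018: {1998: 0.54, 2000: 0.42},
}

def adjustment_rate(versions):
    """Average drift in a past year's anomaly per elapsed year of publication."""
    pubs = sorted(versions)
    drifts = [(versions[late][yr] - versions[early][yr]) / (late - early)
              for early, late in zip(pubs, pubs[1:])
              for yr in versions[early].keys() & versions[late].keys()]
    return float(np.mean(drifts))

rate = adjustment_rate(versions)   # negative here: the past cools as it ages
print(f"2017 (0.90 °C today), as reported in 2050: {0.90 + rate * (2050 - 2018):.2f} °C")
```

On those invented numbers, 2017’s 0.9°C will have been quietly marked down to about 0.66°C by mid-century. Someone with the real vintages could fill in the real rate, and then we could publish the 2050 figure today.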

There. That’s my Tractatus Logico-Climaticus. Of course, there’s nothing philosophical about it, but it does obey Wittgenstein’s dictum at 6.53 above about only saying what can be said, i.e. the propositions of natural science. So I hope someone cleverer than me will actually do the sums and tell us what the temperature in 2017 will have been adjusted to in 2050. My grandchildren’s lives may depend on it.

Meanwhile, I’ll leave you with the last proposition of Wittgenstein’s Tractatus:

7. Whereof one cannot speak, thereof one must be silent.

18 Comments

  1. The very much missed Prof. Philip Stott offered a Tractatus Logico-Climaticus back in the 2000s. I can’t find the original, and I think the logical/mathematical notation might not have survived.

    1 The world is wholly contingent
    1.1 The world is the totality of changes
    1.11 The world is determined by the changes, and by there always being change
    2 It is a river-bed proposition throughout all geology that climate changes
    2.1 Climate change is a tautology
    2.11 Climate is always either ‘warming’ or ‘cooling’
    2.12 Climate changes with or without human contingencies
    2.13 Human influence neither creates nor halts the fact of change
    2.2 The world divides into the facts of change
    2.21 Doing something and not doing something are equally contingent
    3 Causality is not a law which Nature obeys
    3.1 Causality is the form in which propositions are cast
    4 Monocausal propositions deny contingency
    4.1 Monocausal propositions are but the witchcraft of their time
    4.11 Monocausal propositions lead to authoritarian actions
    5 Nothing is so difficult as not deceiving oneself
    5.1 Human monocauses are inherently self-deceptive
    5.11 The world is independent of my will
    5.12 There is no logical connection between my wish and the world
    5.121 fx = “x is controllable”; “fx is false for all values of x”
    5.122 How likely then is (∃x).fx?
    5.123 [p̄, ξ̄, N(ξ̄)]
    6 Logical climate space can be reduced to three words
    6.1 Climate always changes
    6.11 Climate is uncontrollable
    7 What we cannot speak about we must pass over in silence.

  2. As a geologist I feel that the global temperature must include temperatures of the interior, including those of the inner core at 5,700 K (5,430 °C).

  3. Thanks Ben
    Sorry to have stolen Philip Stott’s title, though not his ideas, since his Tractatus is on the nature of science – a different tack, or tract.
    My point is a logical one. They can change past temperatures all they like, or all they can get away with, but they can’t – logically can’t – change current temperatures, because, although they may not be what they say they are, what they say they are must logically be what they say they are – for now.
    It’s only when current temperatures become past temperatures that they can be changed. And we have the data to predict how they will be changed, based on past changes. So we can estimate the future adjusted current temperature now, and save ourselves a lot of future adjustment.

  4. I don’t think Philip would mind in the slightest if you had! (Not least ‘cos he nicked it himself.)

    It ends up in the same place. Whereof one cannot speak, thereof one must be silent.

    Serendipity, perhaps, or just a lucky guess by the algorithm: in the related posts section under the above post is a link to Jaime’s article ‘When Did Quantitative Risk Assessment Become “Key Evidence”?’

    I wonder if the IPCC could be ‘Very Confident’ that it is ‘Very Likely’ that Whereof one cannot speak, thereof one must be silent.

    I doubt it.

    NASA’s interpretation would perhaps be not to speak about it, but to adjust the data, all the same.

  5. It ends up in the same place. Whereof one cannot speak, thereof one must be silent.

    In the same place, but by a different route, and how you get there counts in philosophy. Compare what would happen if a polling company adjusted past figures for voting intentions in order to bring them into line with voting patterns in real elections. Clearly, they could no longer claim that they were “real” voting intentions recorded at the time, but something else, possibly virtual intentions in a world unsullied by unknown sources of random variation. Perhaps NOAA are, consciously or unconsciously, doing the same thing, and their adjusted temperatures are accurate records of what the temperature would have been in a world of monocausal climate change. If so, we have a new kind of metaphysical entity. We could call it Bayes’ Posterior and assign it the Greek letter Psi, which looks the part.

    Climate science is filling the blank spaces on the climate map with these fabulous beasts. There’s already the year that would have been the hottest if El Niño hadn’t made another year hotter, and no doubt soon a year that would have been the hottest if La Niña hadn’t made it cooler. And there’s the weather event that was made more likely to happen than it was the last time it happened by global warming.

    There’s another profound observation that comes to mind when considering climate data, by another Cambridge philosopher – Peter Cook, in the National Gallery:

    “You see that Rubens over there, Dud. That cost us ten million quid. Or ten and sixpence. Or somewhere between the two.”

  6. Geoff says

    It should be possible for a more mathematical mind than mine (Kevin Marshall? Paul Homewood?) to calculate for any given year in the past, how much, on average, the global temperature has been changed downwards in any given following year.

    Given enough time I could create a raw average for the land surface temperature data. But it would not be terribly meaningful. There are about 5000 available data sets. Even then, one would have to weight each data set. The weighting would be different over time depending on the number of stations in the surrounding area. Before 1950 the temperature data sets were much sparser.

  7. Oldbrew
    It’s the car salesman’s question:
    “Did you have a figure in mind?”

  8. The past is not what it used to be. Take GISS for instance:

    In 10 years, the warming between Jan 1910 and Jan 2000 has increased from 0.45°C to 0.69°C, an increase of over 50% in the warming over a 90-year period, due entirely to adjustments. Amazing!

    So January 2000 got a whole lot warmer and January 1910 got a bit cooler. Remember 2000? In March of that year – a couple of months after the January which has since been nicely warmed by adjustments – David Viner of the CRU said children just won’t know what snow is. 18 years later, even kids in the Sahara desert know what snow is, but the Independent adjusted that particular news item out of existence, pretending that they never reported on it. But sadly for them and David Viner and climate ambulance chasers the world over, the internet remembers everything.

    https://wattsupwiththat.files.wordpress.com/2015/11/snowfall-thing-of-the-past.png?w=1000&h=

    The strange thing is, David Viner had no idea when he made that prediction in Mar 2000 that January 2000 would turn out to be 50% warmer than 90 years previously, so you’d have expected his prediction to be even more true and for kids to have known even less what snow is, and for longer… except the US has been buried in the stuff, the global warming conference in Davos has been buried in the stuff, Scotland and Northern England have seen plenty this winter and about the only place on the planet which hasn’t seen any for a while is centred on my postcode!

  10. MANICBEANCOUNTER
    I certainly wasn’t expecting you or anyone to go over 5000 data sets. Either annual global averages would do, or a select few data sets chosen for completeness and longevity.

    I think JAIME JESSOP’s graph has the answer to my riddle, but that last glass of wine is preventing me from seeing it clearly. What we need to know is, given the tendency of the temperature anomalies of years in the distant past (like 1910) to fall, and the tendency of recent years (like 2000) to rise over time, what will the likely anomaly of 2017 be in, say, 2037, when your grandchildren will be the age you would have been if you’d been born a bit later, and worrying if they can keep up the payments on the solar panel, or would do better to emigrate to China?

    I can see that the answer to my question is more complex than I thought, since, in my example, the string is now anchored at a fixed point in 2037, and the temperature in 2000, which in 2017 was being adjusted upwards to keep the slope steep, might now be being adjusted downwards, for the same reason. But there is surely a mathematical solution, since what is true of 2017 will also be true one day of 2037, in, say, 2057, when your great-grandchildren will be sorting recycled waste imported from China, and wondering if they’ll ever see the day when they pay off the instalments on the treadmill that allows them to charge the battery to power the telly so they can see the latest instalment of Game of Thrones.

    I’m afraid I can’t formulate the kind of equation I need, since it involves those f(x) things and the big Sigma sign, as well as recursive thinking, which is when I switched off in advanced maths, and took to reading self improvement books under the desk. It didn’t work.
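
    The nearest I can get is a sketch in code rather than Sigma signs, with every number in it a placeholder rather than anything fitted to data: a year’s anomaly drifts upward while it is still “recent” relative to the year doing the publishing, then drifts downward once it recedes into the past.

    ```python
    def predicted_anomaly(y, a_pub, t_pub, t_future,
                          warm_rate=0.003, cool_rate=-0.002, recent=20):
        """Hypothetical vintage-drift model: data year y drifts upward
        while 'recent' relative to the publication year, downward after.
        All rates and the 20-year cutoff are placeholders."""
        a = a_pub
        for t in range(t_pub + 1, t_future + 1):
            a += warm_rate if (t - y) <= recent else cool_rate
        return a

    # 2017's anomaly, 0.90 °C as published in 2018, as revised in 2037 and 2057:
    for t in (2037, 2057):
        print(t, round(predicted_anomaly(2017, 0.90, 2018, t), 2))
    ```

    On those made-up rates, 2017 warms to 0.96°C by 2037 and then begins its long descent, back down to 0.92°C by 2057: exactly the double movement that makes the string so floppy.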

  11. Re the David Viner thing, I like it when they have
    those important elite climate get-togethers where
    so often the weather doesn’t cooperate – is it
    the Pathetic Fallacy?

  12. This has taken over from “the Great Game”, hasn’t it? Russia, the UK and the USA have all just about given up trying to control the frontier between India and Greater Asia, so let’s play at climate change instead. One side sells gas and builds dirty power stations. The other builds windmills. And China builds eco-destructive hydro plants.

  13. Re David Viner: although he gave the money quotes, it should be noted that he was not alone. The newspaper article was also based on material from David Parker at the Hadley Centre, who seems to have got off almost scot-free.
    Furthermore, David also gave another quote that may not be exactly untrue. With heavy snowfall returning occasionally, he said, “we’re really going to get caught out. Snow will probably cause chaos in 20 years’ time.” (= 2020)

  14. Beth. “Cooperating” weather would seem to be a pathetic fallacy; however, the link between Al Gore and unexpected adverse weather conditions seems established beyond a shadow of a doubt from multiple occurrences. You would think this would be utilized to good (or evil) effect, but perhaps his power is unstable.

  15. Geoff at 26 Jan 18 at 10:16 pm
    For me, Jaime’s graph, from the excellent Climate4you website, is not a complete answer.
    Take a look again.

    This shows evidence that the anomaly has changed significantly, not whether it has improved. You and I might agree that it has become worse, but NASA GISS provide papers on why they think it has improved. Can we provide a persuasive argument to demonstrate that we are right?
    Here is a brief explanation of why I think we are right.
    The global temperature anomaly for land* is a weighted average based on temperature data measurements at specific locations that are far from evenly spaced. Going back in time, there are fewer and fewer thermometers. Also, most of the land surface is vastly under-represented even today.
    The data is quite poor, for a number of reasons, with some clearly anomalous values. Data homogenization is a process used to eliminate the clear anomalies. It is an iterative process of pairwise comparisons of data sets. This homogenization process assumes that nearby stations are exposed to almost the same climate signal, so that the differences between stations are due to inhomogeneities. There are issues with this assumption in particular, and with climate data in general (a toy illustration follows the numbered points below).

    1. I have looked at specific examples where “nearby” stations are not exposed to the same underlying climate signal. Homogenization adjusts genuine variations in the data.
    2. The failure to adjust genuine variations would make it impossible to interpolate the trends between two stations where the trends differ. Over large distances trends differ massively.
    3. Time is important. The further back in time one goes, the fewer the temperature stations, and therefore the larger the genuine differences in underlying climate signal that get homogenized away. Also, different climate signals decades ago might become relatively more anomalous over time.
    4. The current temperature data has been homogenized a number of times.
    5. As Paul Matthews posted last February, there is instability in the GHCN adjustment algorithm. Re-homogenize the data and there will be new results, quite different from before.
    6. How does a checker determine if the homogenization results are valid? They cannot refer back to the raw data: it is wrong, and has been adjusted many times. Further, the temperature station needs to be representative of the surrounding area to enable a global average to be computed. There is an institutionalised bias towards how the data ought to look. Rather than being recognized (with steps taken to compensate), this bias is increasingly enforced. Therefore, there will be a tendency to accept or reject new results on the basis of the direction of trend.
    7. Data is both deleted and, in other areas, infilled. Then the data is homogenized again. The manual work involved increases over time, which explains why the number of temperature stations in the GHCN network has decreased since 1990, even though computing power to run the algorithms has increased ten-thousand-fold or more.
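
    A toy version of the pairwise step may make point 1 concrete (this is the illustration promised above). The caveat: real pairwise homogenization, such as the GHCN algorithm, uses many neighbours and significance tests; this sketch shows only the core assumption, and how it erases a genuine local divergence.

    ```python
    import numpy as np

    def pairwise_adjust(target, neighbour):
        """Toy homogenization: find the biggest mean shift in the
        difference series and remove it from the target station."""
        diff = target - neighbour
        shifts = [abs(diff[:k].mean() - diff[k:].mean())
                  for k in range(5, len(diff) - 5)]   # candidate breakpoints
        k = int(np.argmax(shifts)) + 5
        step = diff[k:].mean() - diff[:k].mean()
        adjusted = target.copy()
        adjusted[k:] -= step                          # "correct" the jump
        return adjusted, k, step

    years = np.arange(1950, 2010)
    neighbour = 0.01 * (years - 1950)   # steady warming at one station
    target = neighbour.copy()
    target[30:] += 0.3                  # genuine microclimate shift, not a station move
    adjusted, k, step = pairwise_adjust(target, neighbour)
    print(f"breakpoint at {years[k]}: genuine {step:.2f} °C shift adjusted away")
    ```

    The algorithm has no way of knowing whether that 0.3°C step was a thermometer being moved or the climate actually changing; it removes it either way.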

    What has this got to do with Wittgenstein? The young Ludwig perceived that our very language is conditioned by our collective perceptions of the world. The climate community is increasingly one that has automatic defence mechanisms to shield against both those with different perceptions of the world, and the facts that contradict the core beliefs.

    *The sea surface temperature measurements until a few years ago were based upon ship readings at non-specific locations. There are far fewer measurements, but they represent double the land area.

  16. Concerning ship’s sea surface temperature measurements.

    Note that in the decades before the buoy networks achieved significant coverage of the oceans, ocean temperature data was acquired mainly from ships’ engine-room cooling-water inlet temperatures.

    Engine cooling-water inlet temperature data is taken from the engine-room inlet temperature gauges by the engineers, at their convenience.

    There is no standard for the location of the inlets (especially their depth below the surface), the position of the measuring instruments in the pipework, or the time of day the reading is taken.

    The instruments themselves are of industrial quality; their limit of error per DIN EN 13190 is ±2°C for a class 2 instrument, or sometimes even ±4°C, as can be seen in the tables here: DS_IN0007_GB_1334.pdf. After installation it is exceptionally unlikely that they are ever checked for calibration.

    It is not clear how such readings can be compared with readings from buoy instruments specified to a limit of error of tenths or even hundredths of a degree C, or why they are considered to have any value whatsoever for the purposes to which they are put, which is to produce historic trends apparently precise to 0.001°C, upon which spending of literally trillions of £/$/whatever is decided.
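
    A back-of-envelope check on what averaging can rescue, assuming (generously) that each reading carries an independent random error with a standard deviation of 2°C, and remembering that the DIN figure is a limit of error, and that systematic biases from inlet depth or gauge placement do not average out at all. The standard error of a mean of N readings is 2/√N:

    ```python
    import math

    sigma = 2.0   # assumed per-reading standard deviation, °C
    for target in (0.1, 0.01, 0.001):
        n = math.ceil((sigma / target) ** 2)
        print(f"{target} °C precision needs ~{n:,} independent readings")
    ```

    Random error can in principle be beaten down with enough ships; the shared biases cannot, however many readings are averaged.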

    But hey, this is climate “science” we’re discussing so why would a little thing like that matter?

    http://www.nature.com/climate/2008/0809/full/453601a.html
