Here's the recent temperature trajectory. Image via RealClimate
There are two things to look at. One is the astonishing temperature spike of the past two years, which has no obvious precedent, and is quite likely an acceleration of some kind.
The second, which the linked article has a lot to say about, is whether we have already surpassed the 1.5 C warming threshold that emerged from the Paris Agreement as an "aspiration" for a warming limit.
I think it's worth making a few comments on the latter. (This is a revision of a series of posts on BlueSky: https://bsky.app/profile/mtobis.bsky.social/post/3lfkiam33kc2g )
So the first thing is that 1.5 C is a benchmark, determined by communication constraints rather than by any known physical or environmental constraint. To be sure, there may be points along the warming scale where impacts get dramatically worse, though that's probably an oversimplification. Even if there were such points, we don't and can't know where they are until we pass them. Rather, the point of setting benchmarks in the unknown territory into which we are propelling ourselves is to gain some collective awareness of whether we are succeeding in slowing our advance into the unknown (probably a little) or stopping altogether (not close). It's a way to communicate our status.
When this benchmark was far away, it seemed adequate to have a fairly rough idea of what it meant. Now that we are close, that roughness is ill-advised. Gavin's text reveals something of the original vagueness of the goal:
"The people have spoken, and they have collectively agreed that 'pre-industrial' can be thought of as the average of 1850 to 1900. There were other candidates – but the influence of IPCC AR6 is too strong to fight against. So, while I've been holding on to 'late 19th Century' (in practice 1880-1899) as a baseline, I have bowed to the inevitable and started producing anomalies with respect to the earlier baseline. But that raises a problem – how do you produce an anomaly with respect to a baseline that isn't in your data set?"

The first sentence implies confirmation of my understanding that, at the time of the Paris Agreement, the Parties to the COP had not bothered to be precise about the meaning of "1.5 C" or "2 C" of warming. Much of Gavin's piece works toward making the 1850 to 1900 benchmark precise. It's a sensible pursuit: if we are going to have benchmarks, it's better that they not be fuzzy ones. But the fuzziness of the benchmark can be overinterpreted as fuzziness of the trajectory! Admittedly, one could make a case that establishing the baseline tells us a little bit about the climate sensitivity (how much warming results from a given forcing). It's not information-free in that regard. But it's not the best way to address that question, or even a good way. Its main import is conventional. As a measure of urgency, our attention is better dedicated to the much better measured (and concerning) recent end of the record, and what it means, than to arguing about the baseline.
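Gavin's closing question is worth making concrete. Re-baselining an anomaly series is trivial arithmetic when the target period is covered by your record (subtract that period's mean); when it isn't, you have to add an offset estimated from other, longer records. Here is a minimal sketch in Python; the function name, the toy series, and the 0.03 offset are all illustrative assumptions of mine, not values from the post:

```python
# Sketch of re-baselining a temperature-anomaly series, assuming a
# simple dict of {year: anomaly in degrees C}; all values are toy data.

def rebaseline(anomalies, start, end):
    """Re-express anomalies relative to the mean over [start, end].

    Only works if the baseline period is actually in the data.
    """
    baseline = [t for yr, t in anomalies.items() if start <= yr <= end]
    if not baseline:
        raise ValueError("baseline period not covered by this data set")
    offset = sum(baseline) / len(baseline)
    return {yr: t - offset for yr, t in anomalies.items()}

# Toy series relative to some arbitrary reference period:
series = {1880: -0.2, 1890: -0.1, 1900: 0.0, 2000: 0.6, 2024: 1.3}

# Shifting to an 1880-1900 baseline is direct, since the data covers it:
shifted = rebaseline(series, 1880, 1900)

# For 1850-1900, a record starting in 1880 cannot do this directly;
# the usual move is to add an externally estimated offset between the
# two baselines (from longer records) -- an assumption, not this data:
OFFSET_1850_1900_VS_1880_1900 = 0.03  # illustrative value only
shifted_1850 = {yr: t + OFFSET_1850_1900_VS_1880_1900
                for yr, t in shifted.items()}
```

The point of the sketch is that the second step imports an uncertainty that is not in your own data set, which is exactly why the 1850-1900 convention creates work for records that begin later.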