Ben Matthews

  • New here on Lemmy, will add more info later …
  • Also on Mastodon: @benjhm@scicomm.xyz
  • Try my interactive climate / futures model: SWIM
  • 0 Posts
  • 102 Comments
Joined 1 year ago
Cake day: September 15th, 2023

  • I’m not convinced this plot bears much relation to the Hansen et al paper. The plot mixes together SSPs (socioeconomic scenarios), RCPs (levels of ‘radiative forcing’ from all gases together; the numbers 1.9 to 8.5 are RF in W/m²), and temperature.
    Hansen et al (OP) are suggesting that the climate sensitivity may be higher than the IPCC range, which would imply a higher temperature for any given RCP level. This suggestion is derived from the recent history of aerosols and temperature, as I discuss in another comment (above/below?).
    However, this doesn’t change which curve we are on (in reality, none of them), as the big difference between curves comes from projections of future emissions, which depend on technology, population, policies etc. Also, the near-term projections have little relation to the long term, due to inertia but also to the range of different models and assumptions used to make them (some years ago). So you can’t look at very recent data to decide which curve we are on.

    I was (years ago) part of the process that designed the IPCC scenario structure of splitting the SSPs from RCPs, to allow the two modelling groups to work in parallel rather than one waiting for the other, so they could get on with connecting feedbacks (not sure this happened…).
    But the IPCC ends up re-blending too many varying factors in one plot, because they have page limits, which can be somewhat confusing. The work of 10,000 scientists over six years condensed into one summary report… - is that the most effective method of communication? That’s why I prefer to make an interactive model.

    Fundamentally, the useful question is not ‘where are we going’ but ‘where do we want to go’ - taking into account all the feedbacks and inertia in the systems, but making choices about our common future. To get to the lowest (1.9) curve we’d need a global green revolution tomorrow, but it’s not physically impossible. As for the 8.5 curve, it’s very unlikely; it’s in the set because it’s similar to the earlier high-end scenarios the IPCC has run since the 1990s, which might have happened if we’d stayed in the coal age (it’s still useful for comparing the high end of the physical models). The others are still in play. Extrapolating current trends and policies (especially tech and population in China), I’d say the most likely outcome is somewhere between the 2.6 and 4.5 curves, but we could do better. While if you are building sea-walls you should anticipate worse. But don’t be fatalistic, we still have many options.

    [ Side remark in response to a comment - as for Jan 2025, it’s really not ‘climate’ to conclude anything from one month. Even the El Niño cycle depends on heat transfer between the surface and deep ocean, which is a slow process - you need to integrate over a much longer time. There may be a warm anomaly recently in the tropical oceans, and also in the high Arctic, but it’s actually quite cold where most northern people live (due to Rossby waves whose amplitude is increasing, as we expected in a hand-waving kind of way). ]
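To make the link between an RCP forcing level and temperature concrete, here is a minimal sketch (not the SWIM model; it assumes the standard linearisation ΔT = S·F/F₂ₓ with F₂ₓ ≈ 3.7 W/m² for a CO2 doubling, and illustrative sensitivity values):

```python
# Minimal sketch: equilibrium warming implied by an RCP forcing level,
# for a given climate sensitivity S (warming per CO2 doubling).
# Assumes dT = S * F / F_2x, with F_2x ~ 3.7 W/m2 (standard value).
F_2X = 3.7  # radiative forcing of a CO2 doubling, W/m2

def equilibrium_warming(forcing_wm2: float, sensitivity: float) -> float:
    """Equilibrium temperature rise (degC) for a given total forcing."""
    return sensitivity * forcing_wm2 / F_2X

# The RCP numbers are the year-2100 forcing in W/m2, so a higher
# sensitivity raises the temperature on EVERY curve, without telling
# us which curve we are on:
for rcp in (1.9, 2.6, 4.5, 8.5):
    print(rcp,
          round(equilibrium_warming(rcp, 3.0), 1),   # IPCC central CS
          round(equilibrium_warming(rcp, 4.5), 1))   # Hansen et al's higher CS
```

The loop prints the same forcing levels mapped through two sensitivities, which is the whole point above: a higher CS shifts all curves up together.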


  • My model - no, it doesn’t yet include clathrates or a specific permafrost feedback, although I’m concerned about these and would like to put such feedbacks in (with wide adjustable parameter ranges to reflect the high uncertainty). My model is interactive, you can play with it in a browser, so it’s hardly typical.
    However, in relation to the OP and the Hansen paper, it’s important to understand that the usual definition of climate sensitivity does not include such “slow” feedbacks - it includes fast atmospheric feedbacks, e.g. the physics of clouds, but not slower biogeochemistry. CS has been used for decades for comparing models, so it’s useful that the definition stays the same: simply the equilibrium (multi-century) response to a CO2 doubling - it’s not any kind of prediction. That’s why it’s surprising that he would draw strong conclusions from a number (4.5) that’s well beyond the normal range.
    Complex 3-D physics models derive CS, while integrated assessment models use CS as a calibration parameter for one component of a complex system including socioeconomic drivers, emissions policies, land-use change, etc. Most models (including mine) do have some climate - biogeochemistry feedbacks (for example, faster soil respiration at higher temperature), which are included in such ‘real’ scenario projections, but wouldn’t change the CS.
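As an illustration of the kind of biogeochemical feedback mentioned (not code from any actual model; the Q10 value and baseline rate are illustrative), soil respiration is commonly given a Q10 temperature response, meaning the flux roughly doubles per 10 °C of warming:

```python
# Illustrative Q10 formulation of soil respiration (a common textbook form,
# not taken from any specific model). Values are illustrative only.
def soil_respiration(temp_anomaly: float,
                     base_rate: float = 60.0,  # baseline soil CO2 flux, GtC/yr (illustrative)
                     q10: float = 2.0) -> float:
    """Soil CO2 flux at a given warming above the baseline climate."""
    return base_rate * q10 ** (temp_anomaly / 10.0)

# Warmer world -> faster respiration -> more CO2 -> more warming.
# This feedback shapes scenario projections, but sits outside the CS
# definition, which covers only fast atmospheric feedbacks to a
# prescribed CO2 doubling.
```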


  • I think the Guardian article may be somewhat exaggerating what the Hansen et al paper says. I’ve been studying this kind of problem for 30 years. It’s indeed true that there are many ways to explain the historical temperature rise by adjusting the balance of positive (mainly greenhouse gases) and negative (sulphate and white-carbon aerosols, volcanoes etc.) forcings. So if you think the aerosol effect was greater, you also have to assume the greenhouse warming was greater to balance it - hence deriving a higher climate sensitivity (CS). In this case, they are arguing that we underestimated the (former) cooling effect of shipping sulphate - I’d agree this is not a trivial factor (and similarly for the warming effect of aviation-induced cirrus, which we could also change quickly with global transport policy). However, I really doubt this change is sufficient to justify such a big shift from the long-developed consensus range of CS.
    In general, the recent historical data series has never been a sufficient constraint on CS (I know from having tried a similar approach for probabilistic analysis with earlier versions of my own model). So we have to use other methods too, and the IPCC consensus for the likely range of 2.5-4 °C for CS is derived from a wide range of methods and sources, particularly but not only the big physics-based models (GCMs). I’d be very cautious about overturning this based on any one study, despite my respect for Hansen and colleagues. Of course, it contributes an interesting new view on this important topic, but it does not justify the headline of the article.
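The trade-off described above can be shown with back-of-envelope arithmetic (all numbers are illustrative, not taken from the Hansen paper): the CS inferred from the historical energy balance scales inversely with the net forcing left over once aerosol cooling is subtracted, so a bigger aerosol estimate mechanically pushes the inferred CS up.

```python
# Back-of-envelope sketch of why a larger aerosol-cooling estimate implies
# a higher inferred climate sensitivity. ALL numbers are illustrative.
F_2X = 3.7          # W/m2 per CO2 doubling (standard value)
DT_OBS = 1.2        # observed warming, degC (approximate)
F_GHG = 3.0         # greenhouse forcing, W/m2 (illustrative)
OCEAN_UPTAKE = 0.8  # heat still flowing into the deep ocean, W/m2 (illustrative)

def inferred_cs(aerosol_cooling: float) -> float:
    """CS implied by the energy balance: S = F_2x * dT / (F_net - N)."""
    f_net = F_GHG - aerosol_cooling
    return F_2X * DT_OBS / (f_net - OCEAN_UPTAKE)

# A modest aerosol estimate keeps CS near the consensus range, while a
# larger one pushes it well above - with the same observed temperature:
print(round(inferred_cs(0.7), 2), round(inferred_cs(1.2), 2))
```

The point is that the same temperature record is consistent with very different CS values depending on an uncertain aerosol number, which is why history alone can't pin CS down.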


  • Fine map, good to see the old names. But some of these routes are pretty impassable even today - for example, I doubt the Wakhan corridor was ever a major route, as even the bottom of that narrow valley rises above 4000 m. And note that the Torugart pass (been there…) is north of Kashgar on the way to Issyk Kul (a lake missing from the map), not on the way to Osh. So, considering the mountains, I guess a larger fraction than indicated crossed the steppe further north - horses don’t need roads or cities, and it’s easier going.


  • Indeed trade links are relevant, so navigable rivers played a big role - before railways, our main transport was either boats or horses (or camels). Horses needed a lot of grass, which thrives in drier mid-continental climates where trees don’t survive wildfires. For example, the Mongol empire was good at trade and connecting cultures and covered a huge area, but not (for long) near coasts, and it still demanded intense tribal loyalty (elements of such culture were absorbed by the next empire, which gradually pushed it back…).


  • As a kid, I learned to write i = i + 1, before school maths taught me it can’t be. The point is, computers do iteration well, especially for modelling the dynamics of real non-linear systems, while classical maths is good at finding algebraic solutions to equilibria - typically more theoretical than real. Calculus is great for understanding repeatable dynamics - such as waves in physics - and for integrating over some distributions. But even without knowing that well, you could still approximate stuff numerically with simple loops, test it, and if an inner loop turns out to be time-critical or accuracy-critical (most are not), ask a mathematical colleague to rethink it - believe in iteration rather than perfect solutions.
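The loops-versus-calculus point can be shown in a few lines (an illustrative example, not from any particular model: forward-Euler integration of exponential decay, chosen because the algebraic solution is known and can check the loop):

```python
# A simple loop approximating dx/dt = -x, whose exact solution is exp(-t).
# Forward Euler: crude, but easy to write, test, and refine later.
import math

def euler_decay(x0: float, t_end: float, steps: int) -> float:
    dt = t_end / steps
    x = x0
    for _ in range(steps):
        x = x + dt * (-x)   # the i = i + 1 spirit: update state in place
    return x

approx = euler_decay(1.0, 1.0, 1000)
exact = math.exp(-1.0)
# the loop already gets close; shrink dt (more steps) only if the
# inner loop turns out to be accuracy-critical
```

If this step ever became the bottleneck, that is the moment to ask the mathematical colleague for a better scheme - not before.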