Science and Technology links (July 25th 2020)
22 thoughts on “Science and Technology links (July 25th 2020)”
nevertheless, the Hansen paper makes my point. Look at the broad range for 2100… it goes from “nearly no impact” to “we are all dead”.
It definitely agrees with your broader point that the uncertainty of our models becomes very large far in the future. I agree with that point too. But I don’t think this paper (or especially Hausfather et al) agrees with your narrower point that past climate models made overconfident predictions that were systematically too high.
But more fundamentally, let’s suppose that the models accurately reflect our best state of knowledge and are ‘well calibrated’ in the sense that their X% confidence intervals contain the true value X% of the time. In that case, it seems to me that the wide range of uncertainty – “nearly no impact to we are all dead” – is a reason to take climate change more seriously, not less. This is based on thinking in expected value terms (in the statistical sense). If we are highly confident that climate change will have a moderate impact, perhaps we should only be moderately worried. But if, due to greater uncertainty, there is a 40% chance of no impact, a 40% chance of moderate impact, and a 20% chance of catastrophic impact, then even if the median impact has fallen somewhat, we should potentially be much more worried. This is because a catastrophic impact may be many times worse than a moderate impact, so the expected value can be dominated by the tails. In this scenario uncertainty makes things worse in expectation.
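To make the expected value point concrete, here is a minimal sketch in Python (the cost figures are hypothetical numbers of my own, not estimates from any model):

# Hypothetical illustration of the expected-value point above.
# Made-up cost units: a "moderate" impact costs 1 unit, a "catastrophic" impact 20 units.
moderate, catastrophic = 1.0, 20.0

# High-confidence case: a moderate impact is essentially certain.
ev_confident = 1.0 * moderate  # expected cost = 1.0

# Uncertain case from above: 40% no impact, 40% moderate, 20% catastrophic.
ev_uncertain = 0.4 * 0.0 + 0.4 * moderate + 0.2 * catastrophic  # expected cost = 4.4

print(ev_confident, ev_uncertain)  # the catastrophic tail dominates the expectation

Even with these made-up numbers, the expected cost is several times higher in the uncertain case, which is the sense in which uncertainty can make things worse.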
I agree that the failure of past catastrophes to materialize is somewhat reassuring. But only somewhat – if there have been about 5-10 of these past catastrophes that people have been similarly worried about, that only seems to warrant something like 80-90% confidence that this one is wrong too. If what we are worried about is tail risk, that isn’t enough.
I’m also curious whether you have thoughts on whether people are systematically biased toward expecting negative possibilities. It seems plausible to me, but I feel uncertain. And do you think ‘negative’ extrapolations have a worse track record than positive ones? For example, thinking of Moore’s law as a positive extrapolation.
But I don’t think this paper (or especially Hausfather et al) agrees with your narrower point that past climate models made overconfident predictions that were systematically too high.
This would require an extensive discussion, and you are not playing the game as I would invite you to. You did not go back in the past, take a specific prediction, and compare it with what actually happened. You refer to a rather vague Hansen paper which, as far as I can tell, made no specific prediction for 2020. Then you cite a paper that says that if you account for new information and other unpredictable effects, the corrected old models were correct. Well. I would agree with this last statement in general. If I am allowed to use old models, correct their assumptions and so forth, then I can probably make them fit reality.
But that is not how you test your predictions. You engrave them in rock and leave them alone and then you look back at them, as they are, years later.
Let us play the game now if you want. Let us bet $100 on your prediction of the global temperature anomaly in 2030, based on satellite data. You must be correct to within 0.1 C. You alluded to confidence in the high end, so maybe you will predict 0.45 C of warming? I am making it very easy for you… not 30 years, just 10 years. But, importantly, you cannot adjust the prediction in 10 years. It must be set in stone. Also, you must be consistent. If you answer 0.1 C or 0.2 C, you are effectively betting on a rather inconsequential warming.
In that case, it seems to me that the wide range of uncertainty – “nearly no impact to we are all dead” – is a reason to take climate change more seriously, not less.
Nobody is arguing for not taking climate change seriously. But probabilities and likelihoods matter. Upper-end estimates put the economic cost of climate change at about 5% of GDP by 2100. This is nothing close to civilizational collapse. However, we could all get wiped out by an asteroid. We are investing nearly nothing in preventing asteroid impacts, and we are certainly not in a hurry to colonize space.
As for doing “something”… you never state what you advocate doing. Let me explain why it matters.
You may have cancer. I am deadly serious. You may develop a deadly cancer at any moment from this point forward. And then you will die. You can go see your doctor and demand to be tested every week for a wide range of cancers. After all, if there is even a small chance that the cancer is caught early, it is for the best, right?
So why are you not getting tested routinely for various cancers?
Well. I can give you one reason: tests are largely ineffectual. You will find more cancers if you test more, and you will receive more treatments… but you won’t live longer and healthier. We know this because there is extensive documentation on the topic.
Now. You might say “fine, I won’t get tested for cancer routinely”. And then I might reply “what! you don’t think that cancer is a serious issue?”
So be specific, again. What is the “something” you want us to do?
Many things could kill us. We cannot invest everything we have in preventing any one thing. You have to be efficient.
What do you propose that we do, specifically?
You stated that past models had made overconfident predictions that were systematically too high, and I said that wasn’t my read on it and that I’d be interested in evidence supporting your claim. You kicked it back to me, and I shared what was informing my opinion. I’d still be interested in seeing what’s informing your opinion, but I’m not an expert on this topic and am not interested in trying to argue in detail for a particular position here. I’m not sure what the right position is.
Still, my takeaway from the Hansen 1981 paper is that it is appropriately confident, not overconfident, and that the projection it makes is too low (though within uncertainty), not too high. This seems to support my view, though I don’t want to draw too much from one paper.
I’ll elaborate. First, this paper acknowledges that future emissions are uncertain, as are various aspects of climate modeling (e.g. ocean heat capacity). It deals with this by laying out projections under a number of scenarios, and explaining the uncertainties caused by the various factors it discusses. This is also consistent with my perspective that past climate models on the whole were not overconfident.
This paper does nevertheless make projections, and these do not overestimate the warming which has occurred between 1981 and 2020. Rather, these projections are accurate within the stated uncertainty, which is a factor of 2. If we plug in actual emissions data, then the point estimates are right on. If instead we use the paper’s emissions scenarios as inputs, then the point estimates from the paper are a bit too low (though again, still well within the stated uncertainty).
On the first point, the paper’s best-guess climate sensitivity of 2.8C would predict warming of 0.78C between 1981 and 2020 given actual emissions, with a range of uncertainty of 0.4C to 1.6C. The actual warming was between 0.77C and 0.86C, so this is right on.
If instead you use the guesses about emissions from the paper, then the paper predicts warming that was too low. The math gets a little more complicated than I feel like checking myself because the paper assumed non-constant emissions, but according to a website I cited above, the ‘high’ and ‘low’ emissions scenarios the paper discusses correspond to warmings of 0.28C-1.1C and 0.36C-1.44C between 1981 and 2020. The midpoints of these estimates are a bit below observed warming, but observed warming falls within uncertainty.
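For what it’s worth, a crude back-of-envelope check in Python (my own sketch, not the paper’s method: the simple logarithmic relation between concentration and warming, the paper’s best-guess sensitivity of 2.8C per doubling, approximate observed CO2 concentrations, and no adjustment for ocean lag or other forcings) lands close to the 0.78C figure:

import math

S = 2.8                        # assumed climate sensitivity, C per doubling of CO2
c_1981, c_2020 = 340.0, 412.0  # approximate atmospheric CO2 concentrations, ppm

# Warming estimate using dT = S * log2(C / C0)
dT = S * math.log2(c_2020 / c_1981)
print(f"{dT:.2f} C")           # prints about 0.78 C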
So what I see is an appropriately confident prediction that was too low, not an overconfident prediction that was too high. (Though again, I don’t want to conclude too much from one paper.)
Your summary of the other paper also does not seem accurate to me. That paper finds that old models do pretty well even if you don’t change their projections at all. It’s just that they do even better if you plug in observed emissions data. Note that this also supports my main thesis about climate modeling throughout this thread, which is that although emissions seem hard to predict, like smartphones, atmospheric warming in response to emissions seems relatively easier to predict (though still hard), more like asteroids. I agree with you that evaluating this paper would be onerous, but I shared it because it’s part of what’s informing my opinion here. I’m ‘leaving this one to the doctors’ to some degree.
you never state what you advocate doing. Let me explain why it matters.
I agree with you on this – nowhere in this thread am I advocating for any particular actions, or even saying that we should take action at all. This is because it’s a complicated topic I feel uncertain about. Instead, in this thread I’m primarily interested in things like how confident we can be in forecasts about the future, how this might differ from topic to topic, how much we should take the future into account in our decision making even if it is highly uncertain, and how these general questions should be informed by some of the more specific points you’ve made like past panics that haven’t panned out, the performance of past climate models, connection between size and time scales, how long the idea has been around, etc.
You might say “fine, I won’t get tested for cancer routinely”. And then I might reply “what! you don’t think that cancer is a serious issue?”
I agree with this too – just because something is important doesn’t mean that any particular remedy is a good idea. Like I said in a previous comment, I think the point I’m making about uncertainty (which I think you’re replying to here) is consistent with your position, but important to keep in mind. And my point isn’t that we should do anything specific, but rather that even very uncertain projections about the future can be decision relevant. Indeed, as I argue, there are situations in which more uncertainty means that the future plays a larger role in our decisions in expected value terms, not a smaller role. (Though if you’re right that the worst-case outcome for climate change is only a 5% hit to GDP, then climate change may not be such a situation, or at least the size of this counterintuitive-to-me effect that ‘more uncertain = more important’ might be small in this case. But I think the worst-case is worse than this – more on that in a sec.)
Upper-end estimates put the economic cost of climate change at about 5% of GDP by 2100. […] However, we could all get wiped out by an asteroid. We are investing nearly nothing in preventing asteroid impacts, and we are certainly not in a hurry to colonize space.
Toby Ord, in his excellent recent book The Precipice, looks carefully at the probability of various existential catastrophes over the next century. (By the way, I strongly recommend his book! I think you’d really like it. If you don’t want to read the whole book, I’d recommend his interview on the 80,000 Hours podcast. It’s episode #72.) Ord estimates the risk of existential catastrophe via comet or asteroid as ~1 in 1,000,000 per century, plus or minus an order of magnitude or so. This is certainly enough to matter, since it could wipe out our whole future. We should do more about it!
He estimates the same risk from climate change at ~1 in 1,000 over the next century, but expects that this number could go down significantly with more research on runaway greenhouse effects – right now a lot of the case that this kind of thing can’t happen rests on only a couple of papers. It’s a case where uncertainty is driving the expected value higher.
I’m not saying these numbers are ‘right’ or couldn’t be disagreed with, and of course our estimates will change a lot as we learn more. But let’s just take these numbers for the sake of argument. In that case, it seems to me that we should be pretty concerned about the tail risks from climate change. So I guess if I had to make a climate change recommendation, it would be: let’s do more research on tail risks.
We cannot invest everything we have in preventing any one thing. You have to be efficient.
Couldn’t agree more! I think climate change is probably more important than the median issue society focuses on, but not the most important issue. I also agree that many proposed remedies seem too expensive to me, though I feel uncertain about this.
You do not refer to Hansen’s predictions from his 1981 paper regarding 2020. You are inferring what he might have predicted while using your current knowledge.
Hansen makes specific predictions in a 1988 paper about the next 20 or 25 years. Within that range, under the business-as-usual scenario, he predicts 0.5C of warming per decade. There is no need to re-engineer his predictions; they are clearly stated: 1C in the next 20 or 25 years (I paraphrase, but it is that clear in the paper). Of course, we have never observed 0.5C or 0.4C or even 0.3C of warming per decade. The prediction is flat out wrong.
To be clear, the Earth is warming. But the evidence for a 0.5C warming per decade or for a 4.5C warming during the 21st century is very thin. And, to be clear, though a 4.5C warming would not be good, the evidence that it would be an extinction event is nil. Human beings would adapt; we have credible economic models. The fauna would be the big problem. What is much more likely is a modest amount of warming (say 2C in a century). Such a modest amount of warming has some good and bad side effects. They balance out to a modest effect on our economy.
Anyone who says, as you suggest someone does, that a climate extinction event has a 1 in 1,000 likelihood… has little credibility in my eyes.
That does not mean that we should not guard against extreme climate change. But you have to have working and efficient strategies. Holding yearly meetings where 10,000 government officials show up in planes each year… is not it.
But the evidence for a 0.5C warming per decade or for a 4.5C warming during the 21st century is very thin. And […] the evidence that it would be an extinction event is nil. Human beings would adapt; we have credible economic models.
Ironically, it seems to me that you’re being overconfident here. I agree with you about the most likely outcomes, but think you’re too confident that the tail risks won’t occur.
Anyone who says, as you suggest someone does, that a climate extinction event has a 1 in 1,000 likelihood… has little credibility in my eyes.
This also strikes me as overconfident. You are saying that you are more than 99.9% sure that a particular outcome won’t come to pass. And not only that, but that someone who is less certain than you must not be credible.
I agreed with most of the rest of your comment.
Ironically, it seems to me that you’re being overconfident here. I agree with you about the most likely outcomes, but think you’re too confident that the tail risks won’t occur.
Maybe we will all be dead in five years due to a crazy virus or a solar flare, maybe the COVID-19 virus will cause terrible mutations and we will all die, maybe plastics in our environment are causing widespread hormonal changes and all women will become sterile, maybe the government is in touch with extraterrestrial beings.
All these things may be true. But there is no evidence for any of it. Saying that there is no evidence is not the same as stating that it is untrue.
This also strikes me as overconfident. You are saying that you are more than 99.9% sure that a particular outcome won’t come to pass. And not only that, but that someone who is less certain than you must not be credible.
I do not have to prove wrong someone’s estimate of the likelihood that we are visited by extraterrestrials or that climate change will cause an extinction event by 2100. Many people are convinced that 5G technology will cause terrible cancers. I know people who are convinced that we have been visited by extraterrestrials. I do not have to disprove such beliefs.
It does not mean that these beliefs are wrong. Maybe 5G will lead to new cancers. My knowledge is limited.
What is the likelihood that the COVID-19 vaccine, 5G, or 4C of warming would lead to an extinction event? I do not know. But the fact that I do not know does not mean that I find people who claim to know credible.
Anyone who can model the future 80 years ahead of time better have a great track record at predicting the future 2 years, 5 years, 10 years ahead of time.
And these people better have skin in the game.
No, this is wrong. I laid out the predictions from his paper both with and without current knowledge. It’s within uncertainty both ways. You said that past climate models were overconfident (wrong with regard to this paper) and predicted warming that was higher than observed (wrong with regard to this paper).
Can you please provide an exact quote from the original paper?
Hansen had three models (called scenarios): A, B, and C. Scenario C was the one where we drastically cut emissions (something we did not do). Scenarios A and B predicted a +1C warming in 20 and 25 years, respectively.
There was no +1C warming between 1988 and 2008 or 2013.
Scenario C predicted a +0.5C warming, which is quite close to the correct value.
Re 5: The Sea Surface Temperatures appear to complement the phenomenal LR04 Benthic Stack δ18O data. Isotope data from drilled marine sediment cores seems like a great source for scientific exploration.
As a climate economist, I must protest regarding current global warming trends: only with drastic global GHG emission reductions do typical climate forecasts flatten out at some potentially palatable increase of maybe 2-3 degrees. Under business-as-usual, estimates show continued warming of 4, 5, 6, or more degrees, rising essentially without limit as time goes on, along with carbon concentration and cumulative emissions. This seems to be near-absolute consensus among climate scientists (and economists), even if there is large uncertainty, not least due to climate sensitivity (degrees of warming per increase in carbon concentration). It is not even clear when emission rates alone could fall without drastic measures in the near future.
A side note, admittedly without having read the quoted paper in full: theories linking climate to the rise and fall of empires abound, but the claim that a 1-2 degree temperature change would, in its own right, have had a decisive influence on the fate of the Roman Empire (and maybe generalize to today, which is why it is interesting at all, right?) simply sounds very unlikely to me. And if it were true, it should probably mean that any climate temperature change should worry us more than it does, independent of the exact level.
Here is a recent international assessment on this issue: a doubling of CO2 would (with high probability) lead to a 1.5-4.5°C rise in temperature.
If you think that more than a doubling of CO2 is likely in our lifetime, please provide a credible reference.
Not sure our own lifetime should be center stage. As I wrote, under business-as-usual I have never heard any evidence that the warming increase (and indeed carbon concentration) will magically stop without further ado, let alone do so (with reasonable certainty) below 3 degrees, as your original statement seems to suggest. I’d be keen to understand whether you can provide evidence for this (if true, very relieving) view. Carbon concentration is now near 420 ppm, up from the pre-industrial 280 ppm, i.e. halfway toward doubling. Annual carbon emissions have multiplied from 6 billion tons of CO2 per year in 1950 to some 37 billion tons in 2019 and seem to be still rising. Maybe there’s some hope they stop rising (thanks to considerable efforts), and maybe there’s hope concentrations stop doing what they have done so far, namely rise convexly over the past 50 years.
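To put rough numbers on the “halfway toward doubling” point (a back-of-envelope sketch on my part, using the usual logarithmic relation between concentration and equilibrium warming, and ignoring lags and non-CO2 forcings): with dT ≈ S × log2(C/C0) and log2(420/280) ≈ 0.58, a sensitivity of 1.5-4.5 degrees per doubling corresponds to roughly 0.9-2.6 degrees of eventual warming from today’s concentration alone, and to the full 1.5-4.5 degrees if concentrations reach 560 ppm.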
Not sure our own lifetime should be center stage.
It is increasingly difficult to make predictions further in the future… 50 years, 100 years… 150 years…
But it is a bit irrelevant. Three degrees is a lot. If you want to discuss what happens with 7 degrees… well, that’s another discussion.
“It is increasingly difficult to make predictions further in the future…”
Sure. For example, maybe we won’t stick to ‘business as usual’. But this seems to just get back to Florian’s initial point, which after this back and forth seems correct – it’s only drastic GHG reductions that keep us below 2-3 degrees in the long run according to typical climate models. Your initial statement seems to be intended to describe only a short time window, without saying so.
But even then, we are framing the discussion in terms of point estimates. This seems problematic due to the large uncertainties in these estimates. For example, it is my understanding that since climate sensitivity may vary by a factor of 3 or so, even within our lifetimes we can’t be confident in a temperature increase of less than 3 degrees. Climate sensitivity could be higher, so we could instead get warming of closer to 4.5 degrees within my lifetime (noting that I’m pretty young). Of course, it could also be lower.
This framing of climate predictions in terms of point estimates rather than intervals seems to dominate lay presentations. I’m sure researchers aren’t confused, but this communication choice has often been confusing for me as a layperson, leading me to think that the tails of the probability distributions from climate models were narrower than they in fact seem to be.
Your initial statement seems to be intended to describe only a short time window, without saying so.
I submit to you, Tyle, that predictions by extrapolation decades into the future have a terrible track record. The Club of Rome predicted in the 1970s massive starvation by the turn of the century. Instead, we got an obesity epidemic. Preparing in the 1970s for a future full of starvation was not only silly, it was also very harmful. The Chinese one-child policy has done tremendous harm for naught.
To put it in clear terms, it is almost surely irrational to base our decisions today on our ability to extrapolate decades in the future. But if you must do it, then do it with much care, as Nobel prize recipient William Nordhaus did. If you apply his DICE model, what you find is that the “drastic GHG reductions” scenario can be much worse than “doing nothing”. Maybe Nordhaus is wrong, but I would warn you against reasoning about exponential curves far out in the future and deriving policies in response. The one man who got the Nobel prize for it does not favour the “drastic GHG reductions” scenario.
Good intentions are rarely sufficient. Often, policies applied with good intentions have terrible side effects. Without a deep understanding of all the relevant factors, it is hard to predict the effect of a policy just a few years out… in fact, just predicting the effects months in the future can be difficult.
Our intuitions are often very wrong. Most people believe, for example, that we pollute more than we did. Yet it is largely the opposite. CO2 emissions per capita are down (at least in the US). Not only that, but total CO2 emissions are down. And they fell more under the Trump presidency than under any other administration. What is happening is that we have another force, a force that tends to trump (pun intended) most other factors: human innovation.
This being said, you state that you are young. May I suggest an experiment? Publish, in a way that cannot be easily erased, your views about the future… 10 years in the future… 20 years in the future… Will there be more trees, more poverty, more pollution… plan it all out… then revisit your predictions when you are older… Make sure that all your views are stated in a falsifiable manner.
That is one good way to test out your ability, or our ability, to predict the future. If you do so, you may gain, over time, an appreciation for what I mean when I say it is hard to predict the future.
Thanks Daniel, lots for me to think about here!
It still seems to me like we can extrapolate into the future better in some cases than others. For example, we can predict the trajectories of celestial bodies far into the future with high confidence, but can’t predict the results of most policy changes with high confidence at all. And it seems to me that in some ways, the question of warming as a function of emissions is closer to the asteroid example.
That said, I take your point that perhaps ‘business as usual’ is an unlikely assumption due to innovation. So even if we can know something about warming as a function of CO2 concentrations in the future, the concentrations are themselves quite uncertain. And I take your point that the connection to policy questions is more uncertain still.
Perhaps you’re right to focus on the policy question (which I didn’t intend my comment to address). And thank you for mentioning the DICE model – I wasn’t aware of it, and will read up.
Thank you for your proposed experiment! I will make an effort to take you up on it. I have tried to make predictions sometimes about smaller questions like “how many books will I read next year”, and even there I don’t do great.
And I still maintain that people should be talking about intervals rather than point estimates! 🙂
For a two-body problem, it is indeed rather easy. The three-body problem is remarkably hard: https://en.wikipedia.org/wiki/Three-body_problem
We can predict the behaviour of the solar system for millions of years in the future, but even assuming that there was no other influence and the bodies themselves remained as they are, we certainly could not predict it for billions of years in the future.
Of course, millions of years is a lot, but you have to consider that you are working at a scale that is millions of times larger than the anthroposphere.
There are many long-lived, time-tested concepts. Ancient religions and cultural elements capture many of those lessons. The Roman Empire used chairs and knives. We will probably find the equivalent to chairs and knives in the far future. However, any idea that is only a few decades old should not be expected to survive many more than a few decades.
When I was 20 years old, there was no such thing as climate change on the horizon. However, we had acid rain. It was damn scary. I read many books about it. It was burning down our forests and wiping the life out of our lakes.
Then we had peak oil to worry about… by the turn of the century, we were going to run out of oil. It was all mathematically true. It was easy. Just look at the oil reserves and how much we used. Just run a model. We were going to run out soon, unavoidably.
You did not consider the Club of Rome predictions, but they all seemed undeniable too. You just plotted the population growth, looked at the maximal food output given our available land, and it was a mathematical certainty that millions of us would starve.
I am skeptical that we are able to reliably predict the trajectories of asteroids in general (e.g., in the asteroid belt).
Regarding your precise statement… there is an easy way to test it. Let us look at the predictions that people were making, say, in 1990 about 2020. Do you think that they worked out? After all, it is only 30 years. No. They did not. The actual warming we experienced was below the error bars set in 1990. We are not talking about point errors; the intervals were wrong.
Now, if we could not predict 2020 from 1990, it seems quite certain that from 2020, we cannot predict 2050, let alone 2100.
This is very hard. The problem is that, to realize it, you have to go back in time and see what you and other people were stating as facts years ago.
People have selective memories regarding when they have been wrong, so they greatly overestimate their ability to predict the future.
For fun, try reading science-fiction from the 1980s or 1990s. Or just watch old Star Trek. Notice how they have nothing like the Internet. They have nothing like our smartphones. They try to predict what the human race will be hundreds of years in the future and they can’t imagine something as powerful as a smartphone. And these are science-fiction writers.
How many people accurately predicted the 2020 pandemic? I mean, how many people have invested in N95 stocks and the like back in 2019?
So it is just not reasonable to work from models that are looking 100 years in the future because these models are almost surely badly wrong.
That does not mean that you cannot prepare for the future. You can. Being stronger and more robust is desirable… but that’s a dynamic process, not a projective one.
Thanks for another thought-provoking response.
Interesting point! I’d be curious to hear if you have further thoughts on how scale relates to time. For example, it occurs to me that if we just assume a proportional relationship between time and linear scale, then predicting what earth’s atmosphere does over the next century (r=10^7 m, t=10^6 hours) might be analogous to predicting what a small bacterial colony does over the next half second (r=10^-3 m, t=10^-4 hours). I don’t really mean this as a critique, more as an exploratory observation on an interesting new idea.
Really interesting point, thanks! This does seem like a good way to set our priors. It seems to me that in some cases we have enough evidence to shift those priors substantially, though. One example would be when we discover features of the natural world, like electrons or the structure and function of DNA. Even shortly after such concepts are discovered and gain empirical support, I think we may rightly be able to conclude that they will probably last much longer than they’ve been around. The conceptual breakthroughs associated with relativity and quantum mechanics may also belong in this category – I think it wasn’t many decades after their introduction that we could be reasonably confident these ideas would be around for many centuries (even if superseded by a more complete picture), conditional on the survival of humanity.
Another example would be fundamental conceptual insights. For example, the concept of prime numbers was bound to last. Similarly for say Shannon’s definition of entropy. And another example would be certain types of new technologies. Following the invention of say writing, or computing technology, perhaps people could have correctly reasoned that although this was a recent innovation, it was likely around to stay in some form.
I’m more interested in the general case than in climate change specifically, but – perhaps one could make the case that the discovery of climate change is analogous to the discovery of a new feature of our natural world, so we shouldn’t be shocked if the idea is around for a while even if it’s relatively new. And actually, I just learned that the idea has been around since the 1890s due to Arrhenius. Though of course concern about it hasn’t become widespread until recently.
I think this is right, but of course some things might be easier to predict than others. My view is that behavior, policy, and innovation with respect to climate change are hard to predict like smartphones, but that temperature increases as a function of atmospheric composition are easier to predict (though still damn hard), more like asteroids.
My casual read is that past climate models, say starting in the 80s or 90s, were mostly right, in the sense that the true warming falls between the low and the high bounds for most of them. That was the case for a couple of lists of past climate models that I looked through. Perhaps these lists were cherry-picked? I’d be open to a better resource. Though as above, I’m more interested in the general points you’re raising than the specific case of climate change.
But I do agree that at some point the uncertainty is such that the predictions no longer say much, and maybe 100 years is pushing it in terms of the models being informative. I’m no climate expert so I don’t know where that line is, but you’re making a good point that I was probably under-weighting before.
Let’s say we have five examples of past catastrophes that people were worried about that never came to pass. Wouldn’t that by itself only warrant something like 80% confidence that the current worries won’t come to pass? That is something like my view – I think there is at least a 20% chance that climate change will turn out to be pretty bad, and between a 0.01% and 1% chance that it’ll significantly curtail humanity’s long term potential. Does that sound about right to you?
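(One way to make that number concrete, though it is my own framing and not necessarily yours: under Laplace’s rule of succession with a uniform prior, after observing n similar warnings that all failed to materialize, the probability that the next one also fails is (n+1)/(n+2), which gives about 6/7 ≈ 86% for n = 5 and 11/12 ≈ 92% for n = 10, roughly the range I have in mind.)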
Past worries of catastrophe that got significant attention and didn’t come to pass: acid rain, ozone layer depletion, nuclear war, peak oil, starvation based on resource scarcity relative to population, maybe Y2K… these are the ones I know about, I’d be curious to hear more examples!
Of course, these worries not coming to pass doesn’t necessarily mean that people were wrong to have worried. First, because the lack of disaster could be due to innovations, policy or behavioral changes, etc. that were partly in response to the worry. Maybe if we hadn’t been worried about the ozone layer or starvation, we wouldn’t have restricted CFCs or gotten the green revolution as soon, or whatever, and we could have faced worse outcomes. Second, because even if there were only say a 1-10% chance of a very bad outcome occurring without our intervention, intervention could still be worthwhile.
(It does seem clear that insofar as people thought it was ‘mathematically certain’ that their models were correct and catastrophes would come to pass, they were of course badly wrong. I’d be curious to get a better sense of how widely these models were believed with certainty.)
I have another question for you. Do you think it’s true that ‘negative’ extrapolations have a worse track record than positive ones? I’m thinking of Moore’s law as an example of a positive extrapolation. I can imagine a couple of reasons to think this might be the case. First, when people make predictions, perhaps they underweight the extent to which humans will take actions in the future to make the good ones come true, and the bad ones not. Second, serious responsible people who are less worried about threats might have less incentive to speak up than those who are more worried. This could be individually rational, but lead to a systematic negativity bias at the collective level.
Finally, I’d like to make a distinction between how confident we can be in projections, and what we should focus on. I think there is a case for the long term being the right place to focus sometimes even if the effects of our actions there are less certain. For example with climate change, the number of people potentially affected in the future might be larger (not just due to population growth, but because the effects could last for many generations), and moreover the effects have the potential to be more severe further in future. I think that this is consistent with your point, but important to keep in mind. Even if we don’t have much faith in the models, the far future might still be an important consideration sometimes in expected value terms.
Cite a specific prediction from the 80s regarding warming by 2020.
There is a human bias whereby, when faced with a problem, human beings want to do “something”. However, often enough, this “doing something” makes things worse.
That’s why, when you are sick, I recommend delegating to a medical doctor. If you try to treat yourself, you will tend to do “something” at all costs, even when the best strategy is to do nothing.
Here are some of the links I was looking at: https://www.carbonbrief.org/analysis-how-well-have-climate-models-projected-global-warming, https://agupubs.onlinelibrary.wiley.com/doi/abs/10.1029/2019GL085378.
From the latter: “We find that climate models published over the past five decades were generally quite accurate in predicting global warming in the years after publication, particularly when accounting for differences between modeled and actual changes in atmospheric CO2 and other climate drivers.”
One specific model from the 80s (since you asked for one, though I don’t think we should conclude much from a single study) is Hansen et al 1981. This included two scenarios, with warming by 2019 (relative to 1981) of 0.55C and 0.72C. It looks like the actual warming was between 0.75C and 0.84C during that same 38-year period. But like I said, I’m by no means informed on this topic and I’d be interested in other resources if you have a different view.
This seems right. I didn’t understand the relevance to the bit you quoted though, which was about the degree to which we should focus on the future even if it’s uncertain. Is your point that we are too eager to act on climate change?
Regardless, I appreciate the exchange.
The quote you refer to implicitly takes into account the errors due to incorrect estimates. Well. No. I mean, sure, you can predict exactly where asteroids in the asteroid belt are going to be, if you are allowed to retrospectively correct your models with new data.