I grew up in the 1980s, in Germany. Global warming caused by the burning of fossil fuels was getting attention in the press in a big way for the first time. The reporting was quite lurid — the announced “Klimakatastrophe” certainly was an attention-grabber — but the underlying scientific argument turned out to be simple enough for a teenager to grasp: The temperature of the earth's surface is higher than it would be without the presence of certain components of the atmosphere, which cause the greenhouse effect (explanation skipped for the purposes of this post). The most important of these gases is carbon dioxide. By burning carbon compounds that were buried underground a long time ago (many millions of years), humans add to the carbon dioxide that is naturally present, thereby increasing the greenhouse effect. By enough that the temperature on the earth's surface should rise, on average? Yes.
The next question an interested mind would ask was “Can we see this temperature rise in measurements?” And back then, after some explanation of the difficulty of measuring such a thing as a global average temperature, the answer was “Not yet: when we plot the curve, it trends upwards, but is still within the error bars. Come back in a few years.” (Error bars! Cool, I had just learnt how to handle experimental error and uncertainty in maths and physics class!)
When, in the early-to-mid 90s, I was working in a university lab, some of my friends worked in environmental physics, and their lab was right on the same floor. So I could ask them: what did they personally think? And the answer was unambiguous: Yeah, it’s going up. We expect it to go up theoretically, and experimentally it’s doing just that. They also warned that things would be a lot messier than just a general warming at each location. We should prepare for more extreme weather — some locations might get wetter or drier or even colder. It didn’t take long for these messages to move from my trusted scientific friends to official reports.
But global warming wasn’t the only or even the primary story about human activities harming the environment in pervasive and important ways that left an impression on my teenage mind. Not even close. Off the top of my head:
- The ozone hole. Stratospheric ozone depletion over Antarctica was reported by a team of scientists from the British Antarctic Survey in 1985.
- Widespread damage to evergreen forests, most noticeable downwind of coal-mining regions, dominated the environmental news in 1983 (and for a few years after that) under the keyword “Waldsterben” — the dying of the forests.
- The Chernobyl nuclear disaster happened on 26 April 1986. (I surprised my American/Canadian partner the other day by remembering the date.) It is hard to find a good single overview article on the web other than Wikipedia. For me and my peers, this day marked the end of mushroom hunting and wild berry harvesting, and for weeks the parents of my school friends checked our school with Geiger counters.
- The river Rhine, whose ecosystem was already known to be severely damaged by industrial pollution, suffered multiple toxic spills, most prominently the release of large quantities of chemicals into the river after a fire at the Sandoz agrochemical storage plant near Basel in November 1986.
I think it is the lessons I drew from how these and other events played out and were handled, then and over the years, that shape my attitudes now. A recent (and excellent) episode of the science podcast published by the journal Nature looks back at the discovery of the ozone hole. One of the original authors of that first paper can be heard putting it thus: “All of a sudden you look at it differently: Wow, we really can affect the planet as a whole.”
This statement on its own very much captures the main lesson, and it was new at the time. [The final part 3 is scheduled for May 21.]