One of the many things to appreciate about our home planet is that buried in its layers of rock is a kind of time machine. These strata tell us so much about our tumultuous history of glaciers, volcanoes and asteroid impacts, as well as the plants and animals that lived, evolved and died over eons.
There’s no doubt that future geologists or archaeologists will find a lot to interest them in the layer being laid down right now — weird materials from plastic to plutonium and dramatic changes in the nature of fossilized plants and animals. And yet recently, a group of scientists rejected a proposal to give our current epoch a new name: the Anthropocene, derived from the Greek word for human.
That’s too bad. It’s a fitting name and scientifically sound idea — and seems to have been thrown out over technicalities.
The practice of dividing deep time into segments began before scientists recognized how old our planet was. Geologists in the late 1700s and early 1800s saw layers of rock containing different materials and, sometimes, very different embedded fossils, and noticed that these layers sometimes changed at abrupt boundaries. They were starting to consider that the Earth might be millions of years old, rather than thousands, but it wasn’t until the 1950s that researchers established that our planet had been around for 4.5 billion years.
Scientists of the early 1800s started ordering these geologic ages in a nested system — the biggest units were eons, within which were eras, then periods, then epochs.
Our current period is called the Quaternary, and within it are two epochs — the Pleistocene, which started 2.5 million years ago and is known for its periodic ice ages, and the Holocene, which started 11,700 years ago and has been a relatively stable, mild period that allowed humanity to spread around the globe.
Many of the previous periods are named after geographic locations where rock formations or fossils were identified. The name Anthropocene was proposed in 2000 by chemist Paul Crutzen, who won a Nobel Prize for his part in the discovery that human activities were threatening Earth’s protective ozone layer.
In 2009, a team of scientists known as the Anthropocene Working Group set out to pick a date when the Holocene ended and the Anthropocene began. They eventually settled on 1952, when humanity added plutonium and other detectable by-products of atomic bomb testing to our planet’s surface.
The recency of that date seemed to be a sticking point for the scientists who rejected the Anthropocene concept. Some also argued that what we’re calling the Anthropocene is not so much an epoch as an event — a rapid environmental change that might or might not kick off a new epoch.
Richard Alley, a Penn State University professor, says that since the Anthropocene has just gotten started and represents only a sliver on the top of Earth’s crust, geologists don’t need it for mapping purposes. But if you appreciate that the lines the early geologists drew between different eras represented upheavals, then what’s going on now easily qualifies.
Barring some spectacular technological intervention, the carbon dioxide that’s come from burning fossil fuels could take 100,000 to 500,000 years to be reabsorbed by Earth, Alley said. In the meantime, the resulting glacier loss and sea level rise will affect people for thousands of years. So however long our species lasts, the influence of recent decades will reverberate through time.
Historically, people had trouble believing we could change something as powerful and vast as Earth’s climate. And human beings couldn’t have known, at first, that they were changing the planet’s atmosphere.
Of course, we now know that humanity is leaving a big mark in the strata. What we don’t know is how future scientists will judge us. Naming this era the Anthropocene could be seen as a positive statement about our species — that we had the foresight and self-awareness to recognize our growing impact on our vast but limited Earth.
F.D. Flam is a Bloomberg Opinion columnist covering science.