Greenhouse early-Earth myth

31 March 2010

"The Blue Marble" is a famous photograph of the Earth taken on December 7, 1972, by the crew of the Apollo 17 spacecraft en route to the Moon at a distance of about 29,000 kilometres (18,000 mi). It shows Africa, Antarctica, and the Arabian Peninsula.


Scientists claim to have solved "the faint Sun paradox", a long-standing scientific mystery highlighted in the 1970s by Carl Sagan and George Mullen.

The paradox points out that the Earth's temperatures have been relatively constant during the four and a half billion years that the planet has been in existence, despite the fact that the heat output from the Sun has increased by 30% over this time. The question is, how did the early Earth remain so warm in the face of a fainter Sun? It should have spent at least half its early life frozen solid.
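To see why a fainter Sun should mean a frozen planet, it helps to run a simple radiative balance. The sketch below is a minimal illustration, not the researchers' calculation; the solar constant, the 0.3 albedo, and the no-greenhouse assumption are all mine:

```python
# Toy energy balance: a planet with no greenhouse effect settles at
# the temperature where absorbed sunlight equals black-body emission,
# T = [S * (1 - A) / (4 * sigma)] ** 0.25.

SIGMA = 5.670e-8  # Stefan-Boltzmann constant, W m^-2 K^-4
S0 = 1361.0       # present-day solar constant, W/m^2 (assumed value)

def equilibrium_temp(solar_constant, albedo):
    """Effective temperature balancing absorbed sunlight against emission."""
    return (solar_constant * (1 - albedo) / (4 * SIGMA)) ** 0.25

t_today = equilibrium_temp(S0, 0.3)        # ~255 K with today's Sun
t_early = equilibrium_temp(0.7 * S0, 0.3)  # Sun roughly 30% fainter: ~233 K

print(f"Today's Sun:     {t_today:.0f} K")
print(f"Faint young Sun: {t_early:.0f} K (water freezes at 273 K)")
```

Even with today's Sun, this bare-rock figure sits below freezing (the real Earth is rescued by its greenhouse effect); the extra twenty-odd degrees lost to a fainter Sun is the gap the paradox asks the young Earth to have bridged.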

Originally, scientists thought the answer must lie in high levels of a greenhouse gas like ammonia (NH3), whose formation the low-oxygen environment of the young Earth would have favoured. But then they realised that ammonia breaks down in sunlight, so that couldn't be the answer.

The problem was thought to have been solved when the American scientist James Kasting suggested in the early 1990s that CO2 was probably the answer. He found that, acting as a greenhouse gas at an atmospheric concentration approaching 30% and working together with water vapour, it could keep the Earth at a temperature of about 70 degrees Celsius.
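A back-of-the-envelope way to see how a strong greenhouse can bridge the gap is the textbook one-layer model, in which an atmosphere transparent to sunlight but opaque to infrared forces the surface to radiate twice the absorbed flux. This is a classroom toy, not Kasting's radiative calculation, and the input numbers are assumptions:

```python
# One-layer greenhouse toy model: an IR-opaque atmospheric layer
# re-emits half its radiation back downward, so in equilibrium the
# surface radiates twice the absorbed solar flux,
# giving T_surface = 2**0.25 * T_effective.

SIGMA = 5.670e-8  # Stefan-Boltzmann constant, W m^-2 K^-4

def effective_temp(solar_constant, albedo):
    return (solar_constant * (1 - albedo) / (4 * SIGMA)) ** 0.25

t_eff = effective_temp(0.7 * 1361.0, 0.3)  # faint Sun, modern-style albedo
t_surf = 2 ** 0.25 * t_eff                 # one fully absorbing IR layer

print(f"Effective temperature:    {t_eff:.0f} K")
print(f"Surface under greenhouse: {t_surf:.0f} K (just above freezing)")
```

Even this crude one-layer atmosphere lifts the surface from about 233 K to about 277 K, which is why a thick CO2 blanket looked like such an attractive solution.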

For a while, everyone was happy with this explanation. But then geochemists unearthed ancient rocks showing that CO2 levels were probably too low for this to work, and the jury was out again. Now, a paper in Nature has produced a plausible explanation.

University of Copenhagen scientist Minik Rosing and his colleagues have analysed compounds of iron found in rocks more than 3.8 billion years old. These so-called banded-iron formations contain two different iron minerals, magnetite and siderite, which form in different ratios according to how much CO2 is around.

Their results indicate that there couldn't have been much more than three times present-day levels of CO2 in the ancient atmosphere; since CO2 today makes up only a few hundredths of a percent of the air, that caps the ancient concentration at roughly 0.1%. This is a far cry from the 30% that would be needed if CO2 were the answer.

Instead, they suggest that the effect is down to albedo: the fraction of incoming solar energy reflected off the planet back into space. On the early Earth, they point out, the continents were much smaller, most of the surface was heat-hungry ocean, and clouds were made of larger water droplets because there were fewer cloud-forming particles in the atmosphere. These effects meant that far less energy was bounced back into space, keeping the planet warmer.
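In the same toy energy balance as above, darkening the planet turns out to be worth a surprising amount. The albedo values below are illustrative guesses, not numbers from the paper:

```python
SIGMA = 5.670e-8        # Stefan-Boltzmann constant, W m^-2 K^-4
S_EARLY = 0.7 * 1361.0  # a Sun roughly 30% fainter than today, W/m^2

def equilibrium_temp(solar_constant, albedo):
    return (solar_constant * (1 - albedo) / (4 * SIGMA)) ** 0.25

# A modern-style albedo (~0.30) versus a darker early Earth (~0.15):
# smaller continents, more open ocean, and clouds of larger droplets
# all reflect less sunlight back to space.
for albedo in (0.30, 0.15):
    print(f"albedo {albedo:.2f}: {equilibrium_temp(S_EARLY, albedo):.0f} K")
```

Halving the albedo buys roughly ten degrees of effective temperature in this sketch; combined with water vapour and the modest CO2 that was present, that is the kind of margin the authors invoke.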

So why didn't the world overheat as the Sun aged and began to produce more heat? Because, the researchers show, the continents grew, life appeared, and more light-reflective clouds formed in the sky; the rising reflectivity offset the brightening Sun, adding up to a balmy, stable temperature.

The result is also important, say the scientists, because, contrary to the popular belief that CO2 levels in the past were much higher than they are today, carbon dioxide appears to have remained relatively stable throughout the lifetime of the Earth. This needs to be taken into account in the climate models scientists are using to predict potential climate change.
