“How big” is almost always an easier question to answer than “how old.” Though we can measure the sizes of animals and plants easily enough, we can often only guess at their ages. The ancient Greeks Eratosthenes and Aristarchus measured the sizes of the Earth and the Moon, but could not begin to understand how old they were. With space telescopes, we can now measure the distances to stars thousands of light-years away using parallax, the same geometric technique proposed by Aristarchus (a sketch of the arithmetic follows below), but no new technology can overcome the fundamental mismatch between the human lifespan and the timescales of the Earth, the stars, and the universe itself. Despite this, we now know the ages of the Earth and the universe to much better than 1 percent, and we are beginning to date individual stars.
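To make the parallax idea concrete, here is a minimal sketch of the arithmetic, assuming the standard convention that a parallax of one arcsecond corresponds to a distance of one parsec (about 3.26 light-years). The function name and sample value are illustrative assumptions, not anything from this essay.

```python
# Minimal sketch of distance-from-parallax, assuming the standard
# small-angle convention: distance [parsecs] = 1 / parallax [arcseconds].
# Names and sample values are illustrative, not taken from the essay.

PARSEC_IN_LIGHT_YEARS = 3.2616  # one parsec expressed in light-years

def distance_from_parallax(parallax_arcsec: float) -> float:
    """Distance in light-years implied by an annual parallax angle.

    The Earth's orbit provides the baseline: a star whose apparent
    position shifts by one arcsecond over the year is one parsec away,
    and smaller shifts mean proportionally greater distances.
    """
    if parallax_arcsec <= 0:
        raise ValueError("parallax must be a positive angle")
    return (1.0 / parallax_arcsec) * PARSEC_IN_LIGHT_YEARS

# One milliarcsecond of parallax puts a star at the "thousands of
# light-years" scale mentioned in the text.
print(f"{distance_from_parallax(0.001):,.0f} light-years")  # 3,262 light-years
```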
In the Western world, the key to the age of the Earth was long assumed to be the Bible and its account of creation. Creation dating required careful accounting of the chronology given in Genesis and then matching it to historical events recorded elsewhere. These estimates were not seriously challenged until the emergence of modern geology in the eighteenth century. Charles Lyell popularized the concept of uniformitarianism in the mid-1800s and argued that the Earth had to be very old indeed: slow, familiar processes such as erosion and sedimentation, acting over many millions of years, could explain the geological record without recourse to the great flood of Noah.
More generally, uniformitarianism holds that the physical laws and processes we see today are the key to understanding the past.
This is the idea that today enables scientists (including many past and present Members of the Institute) to understand the afterglow of the Big Bang and to see the universe as it was 380,000 years after it formed.

The physicists of the nineteenth century were harder to convince. Hermann von Helmholtz and Lord Kelvin proposed that the Sun shines by slowly converting gravitational energy into heat. The Sun must be shrinking for this explanation to work, and the age it allows, a few tens of millions of years, fell far short of what the geologists required.
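A back-of-the-envelope version of that argument is easy to reproduce. The sketch below is an illustration under standard assumptions rather than anything from the essay: it divides the Sun's gravitational energy (of order GM²/R) by its luminosity, the classic Kelvin-Helmholtz estimate of how long contraction alone could keep the Sun shining.

```python
# Sketch: the Kelvin-Helmholtz timescale, i.e. how long gravitational
# contraction alone could power the Sun at its present brightness.
# Standard SI constants; order-unity factors are dropped, as in the
# classic back-of-the-envelope estimate.

G = 6.674e-11      # gravitational constant, m^3 kg^-1 s^-2
M_SUN = 1.989e30   # solar mass, kg
R_SUN = 6.957e8    # solar radius, m
L_SUN = 3.828e26   # solar luminosity, W
SECONDS_PER_YEAR = 3.156e7

# Available gravitational energy ~ G * M^2 / R, divided by the rate at
# which the Sun radiates it away, gives the contraction lifetime.
t_kh_years = G * M_SUN**2 / (R_SUN * L_SUN) / SECONDS_PER_YEAR
print(f"about {t_kh_years / 1e6:.0f} million years")  # about 31 million years
```

The result, roughly thirty million years, is why Kelvin insisted the Earth could not be nearly as old as the geologists claimed.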
The impasse was broken in the early twentieth century, when it was discovered that some chemical elements decay into others at highly stable rates.
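Those stable rates are what turn rocks into clocks: count the surviving parent atoms and the daughter atoms they have produced, and the exponential decay law gives the elapsed time. The sketch below is a hypothetical illustration of that arithmetic, using the commonly cited rubidium-87 half-life; the sample numbers are invented for the example.

```python
import math

# Sketch: radiometric dating from a parent/daughter ratio. If a mineral
# formed with only the parent isotope, the age follows from the decay law:
#   t = ln(1 + daughter/parent) / decay_constant,
#   decay_constant = ln(2) / half_life.
# Hypothetical illustration; isotope and sample counts are assumptions.

def radiometric_age(parent_atoms: float, daughter_atoms: float,
                    half_life_years: float) -> float:
    """Age in years implied by counts of parent and daughter atoms."""
    decay_constant = math.log(2) / half_life_years
    return math.log(1 + daughter_atoms / parent_atoms) / decay_constant

# Example: a mineral in which about 6.2% of the original rubidium-87
# has become strontium-87 (half-life roughly 48.8 billion years) dates
# to about 4.5 billion years, roughly the age of the Earth.
age = radiometric_age(parent_atoms=93.8, daughter_atoms=6.2,
                      half_life_years=48.8e9)
print(f"{age / 1e9:.1f} billion years")  # 4.5 billion years
```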