A brief, 20,000-year history of timekeeping

As clocks got more accurate, we had to redefine the second.

Popular Science


Photo by Fabrizio Verrecchia.

By Kelsey Atherton

Over millennia, humankind’s time-tracking has grown increasingly precise. Sundials divided days into hours. Clocks broke hours into quarters and minutes, and finally minutes into seconds. As timepieces evolved, so did scientists’ need for ever-more-exact tickers. They developed devices that relied not on Earth’s wobbly rotation, but on microscopic atomic movements. At the heart of it all is an ever-advancing appreciation for our smallest temporal unit, the second. Modern systems like GPS and cellphones rely on keeping this interval consistent, which makes defining and refining it, well, of the essence.

18000–8000 BCE

Earthen Calendars. Illustration by Polly Becker.

A hash-marked bone found in the Semliki Valley in the Democratic Republic of the Congo might be the earliest human attempt to count the days. Ten thousand years later, in what’s now Scotland, humans dug moon-shaped pits to track the lunar cycle.

3500 BCE
