One of the most important concepts in climate science is natural climate variability. Put briefly, it refers to complex natural processes, on Earth or beyond it, that can change the planet’s climate, sometimes drastically, over time. It has long been a thorn in scientists’ sides: they have spent decades demonstrating that the changes underway now are driven by humans rather than by natural fluctuations.
It is because of this variability that Earth’s climate constantly changes on its own, independently of humans, over long intervals of time. These intervals, which can last even hundreds of millions of years, are known as ice ages, and they divide into alternating glacial and interglacial periods: cold and warm climate phases. The most recent cold phase to stand out in Earth’s climate history, though not a true ice age, is known as the Little Ice Age. It was a period of cooling far shorter than a glacial period, lasting from the mid-14th century to the mid-19th, around the dawn of the Second Industrial Revolution.
In climatological terms, the Little Ice Age was merely a cool phase within the interglacial period that we still live in today.