The solar constant is defined as the power collected outside the atmosphere by a unit area perpendicular to the light path. In physics, this quantity is sometimes known as flux density: the ratio of the total power collected by an element to the total area of that element. The solar constant is also referred to as total solar irradiance (TSI).

The 'solar constant' itself is not a constant but is roughly 1370 watts/m2. There are two major factors that affect this number:

  • Eccentricity of orbit (Sun - Planet distance is variable)
  • Energy output of the Sun

The total energy radiated by the Sun has changed when viewed over large timescales. In the very early years of the Sun's life, it radiated less energy than it does today.

The solar constant has been measured by satellites since 1978.

The data from these satellites show that the average has indeed changed over the past 20 years, ranging from 1363.1464 watts/m2 to 1368.0818 watts/m2. The measurements show that the Sun is a slightly variable star with a period on the order of 11 years.

Overall, the solar constant has increased since the creation of the solar system (about 4.7 billion years ago). The initial value was about 70% of what it is now. During the Carboniferous period (about 300 million years ago) it was about 2.5% less than the value today. With these data points, the approximation can be made:

S(t) / S(today) = [1 + 0.4(1 - t/4.7)]^(-1)
where 't' is the time in billions of years since the creation of the solar system. Thus, 4.7 billion years from now, the Sun would be about 67% 'hotter'. As the Sun shrinks, it grows hotter. In 1990, the Chinese Academy of Sciences reported that the Sun's radius had shrunk 410 kilometers between 1715 and 1987, based upon solar eclipse studies. This shrinking would result in a higher solar temperature and increased solar radiation.
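As a quick sanity check, the approximation above can be coded directly. The present-day value S0 = 1367 watts/m2 is an assumed figure for illustration; published measurements vary by a few watts/m2.

```python
def solar_constant(t, s0=1367.0):
    """Approximate solar constant (watts/m2) at time t, where t is in
    billions of years since the creation of the solar system (t = 4.7
    is the present day). s0 is an assumed present-day value."""
    return s0 / (1.0 + 0.4 * (1.0 - t / 4.7))

# Ratios relative to today reproduce the data points quoted above:
print(solar_constant(0.0) / solar_constant(4.7))  # ~0.71: early Sun at ~70% of today
print(solar_constant(4.4) / solar_constant(4.7))  # ~0.975: Carboniferous, ~2.5% lower
print(solar_constant(9.4) / solar_constant(4.7))  # ~1.67: 4.7 billion years from now
```

The ratios depend only on the bracketed term, so the exact choice of s0 does not affect them.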

During the maximum years of the 11-year solar cycle (the years with a high number of sunspots), the solar constant is about 0.1% greater than during the minimum years. Even this slight change can have large impacts upon the climate of Earth.

In the late 17th and early 18th centuries there was a period of 70 years, now known as the Maunder Minimum, during which almost no sunspots were observed on the Sun. During this time, Northern Europe experienced the "Little Ice Age": the fjords of Norway froze over and bitter winters persisted for 50 years. Again in the early 19th century, between 1800 and 1830, there was another decrease in sunspots, and temperatures in Europe and America were at all-time lows. 1816 became known as "The Year Without a Summer", during which 1,800 people froze to death. It was at this time that Charles Dickens wrote of the harsh winters in London. In 1814, a 'frost fair' was held on the River Thames.

On the flip side, there was a warm period lasting from 1000 AD to 1400 AD, with its peak around 1200 AD, known as the Medieval Optimum. The era of the Vikings fell in the early part of this period. The average temperature at this time was higher than it is today: the summers were warm and dry (and spring was not cold), England was a prime vineyard region (and well known for its wines), while Iceland and Greenland were Viking territories. The vineyards and the Viking colony in Greenland perished during the Little Ice Age, between 1400 and 1600. This information comes largely from tree rings, ice cores, and contemporary climate descriptions (such as records of England being a major wine exporter).
