The Stefan-Boltzmann law for blackbody radiation relates the temperature of a blackbody to the energy flux at its surface. It is named after Josef Stefan, who first stated it in 1879 on the basis of experiment, and Ludwig Boltzmann, who derived it from theory five years later. The law can be stated mathematically as:

F = σT^4

where

F = energy flux at the surface (in joules per square metre per second, i.e. watts per square metre)

T = temperature (in kelvins)

σ = Stefan-Boltzmann constant (approximately 5.67 × 10^-8 W m^-2 K^-4)
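The law is a one-line computation; a minimal sketch (the function name is for illustration):

```python
# Stefan-Boltzmann law: F = sigma * T^4
SIGMA = 5.670e-8  # Stefan-Boltzmann constant, W m^-2 K^-4

def energy_flux(temperature_k: float) -> float:
    """Energy flux (W/m^2) at the surface of a blackbody at the given temperature in kelvins."""
    return SIGMA * temperature_k ** 4

# Doubling the temperature multiplies the flux by 2^4 = 16:
print(energy_flux(600.0) / energy_flux(300.0))  # 16.0
```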

This law states that if an object's temperature is doubled, the energy it emits increases by a factor of 2^4 = 16. If the temperature increases by a factor of 10, the rate of energy emission increases ten-thousand-fold. This is why an iron bar at 300 K (room temperature) gives off no noticeable visible light, but at 3000 K (approximately 2700 degrees Celsius or 4900 degrees Fahrenheit) it glows intensely.

This relationship is used in astronomy to calculate the energy flux at the surface of a distant star from its surface temperature, which can be obtained independently from Wien's displacement law.
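The two laws can be chained as described: Wien's displacement law (λ_max = b/T, with b ≈ 2.898 × 10^-3 m·K) recovers the temperature from the wavelength of peak emission, and the Stefan-Boltzmann law then gives the surface flux. A minimal sketch, with the function name chosen for illustration:

```python
# Estimate a star's surface energy flux from the peak of its spectrum.
SIGMA = 5.670e-8   # Stefan-Boltzmann constant, W m^-2 K^-4
WIEN_B = 2.898e-3  # Wien's displacement constant, m K

def flux_from_peak_wavelength(lambda_max_m: float) -> float:
    """Surface energy flux (W/m^2) of a blackbody whose spectrum peaks at lambda_max_m metres."""
    temperature = WIEN_B / lambda_max_m  # Wien's law: T = b / lambda_max (kelvins)
    return SIGMA * temperature ** 4      # Stefan-Boltzmann law: F = sigma * T^4

# The Sun's spectrum peaks near 502 nm, implying T ~ 5770 K and F ~ 6.3e7 W/m^2.
print(f"{flux_from_peak_wavelength(502e-9):.3g} W/m^2")
```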