Apparent magnitude is a measure of a star's brightness as observed from the Earth. The lower the magnitude, the brighter the star.

Some sample apparent magnitudes:
Proxima Centauri: +11.0
Alpha Centauri: -0.27 (combined light of A and B)
Barnard's Star: +9.5

The convention of apparent magnitude is based on a system devised in ancient Greece by the astronomer Hipparchus (160-127 BC). The Greeks classified stars into six categories based on brightness - 1st magnitude for the brightest stars they could see, 2nd magnitude for the next brightest, and so on down to 6th magnitude, the dimmest stars visible to the naked eye.

The night sky visible to the Greeks was markedly different from the one seen by modern astronomers, and by everyday people, in the 21st century. There was virtually no light pollution whatsoever, so the sky from their point of view would have been teeming with stars. Unless you're reading this in the desert somewhere, the sky you see every night is incomparable to what you would see if you went some distance into the country, and I'm told that the night sky in really isolated places, like the Himalayas, is something different again. Modern light pollution, depending on exactly where you are, washes out stars dimmer than about 3rd to 5th magnitude. This means that nearly all the stars visible in a perfect night sky are off-limits, so seeing a properly dark sky at least once in a lifetime is a good ambition to have.

On the other hand, modern technology means that, if you bother to visit an observatory, you can see stars far, far dimmer than anything visible to the ancients. The apparent magnitude scale can now be extended to about +25: considering that the dimmest star Hipparchus could see was +6, we are talking very dim stars indeed.

The ancient Greeks could not measure the intensity of the light they saw quantitatively; they could only go on their subjective perception. However, the human visual system does not perceive brightness linearly, but logarithmically: something perceived as half as bright actually delivers a great deal less than half the light. These days we can measure the actual brightness of the stars we see, and it turns out that each step up the magnitude scale corresponds to a decrease in brightness by a factor of about 2.5 (more precisely, the fifth root of 100, about 2.512). In other words, magnitude 6 stars are 100 times dimmer than magnitude 1 stars. This underlines the extraordinary dimness of the magnitude +20 or so stars we can now see.

This fact gives us the general equation for the apparent magnitude, m, of a star: m = -2.5 × log10(I / I0), where I is the intensity of the light received and I0 is a reference intensity that defines magnitude zero. Equivalently, the difference between two stars' magnitudes is m1 - m2 = -2.5 × log10(I1 / I2).
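
As a quick check on those numbers, here is a minimal sketch in Python (the function name and the idea of working with an intensity ratio are my own choices for illustration, not anything from the sources listed below):

    import math

    def magnitude_difference(intensity_ratio):
        # Pogson's relation: m1 - m2 = -2.5 * log10(I1 / I2).
        # A ratio below 1 means the first star is dimmer, so the result is
        # positive, i.e. a higher (fainter) magnitude.
        return -2.5 * math.log10(intensity_ratio)

    print(magnitude_difference(1 / 100))  # 5.0: 100 times dimmer is 5 magnitudes fainter
    print(100 ** (1 / 5))                 # 2.511...: the brightness ratio of one magnitude step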

The apparent magnitude of a star is also, in a sense, an accident of our vantage point: we observe from Earth, an arbitrary point in the Universe. The intensity of the light we receive depends not only on the star's intrinsic brightness but also on its distance, and it falls off in proportion to the square of the distance - this is the inverse square law.
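
To make the distance effect concrete, here is another small sketch along the same lines (again, an illustration only; the function name is mine):

    import math

    def magnitude_increase(distance_factor):
        # Moving a star 'distance_factor' times further away dims it by a
        # factor of distance_factor**2 (inverse square law), which raises its
        # apparent magnitude by 2.5 * log10(distance_factor**2).
        return 2.5 * math.log10(distance_factor ** 2)

    print(magnitude_increase(2))   # about 1.5: twice as far away looks roughly 4 times dimmer
    print(magnitude_increase(10))  # 5.0: ten times further away looks 100 times dimmer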

The Sun, for example, has an apparent magnitude of about -27, but this is mainly because it is very close, not because it is an exceptionally luminous star. The absolute magnitude, M, of a star is the apparent magnitude it would have for an observer at a standard distance of 10 parsecs, or about 33 light years. Only this is a fair measure of the star's true brightness.
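
Apparent and absolute magnitude are related through the distance modulus, M = m - 5 × log10(d / 10), with d in parsecs. A short sketch as a sanity check (the Sun's figures used here, an apparent magnitude of about -26.7 at roughly 4.85 × 10^-6 parsecs, are standard values rather than anything from the sources below):

    import math

    def absolute_magnitude(apparent_mag, distance_parsecs):
        # Distance modulus: M = m - 5 * log10(d / 10 pc).
        return apparent_mag - 5 * math.log10(distance_parsecs / 10)

    # The Sun, seen from 1 AU (about 4.85e-6 parsecs):
    print(absolute_magnitude(-26.7, 4.85e-6))  # about +4.9: an unremarkable star from 10 parsecs away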


Some apparent magnitudes are listed below:

Sun: -27
Venus: -4.5 (at brightest)
Sirius (brightest star): -1.5
Alpha Centauri (nearest star system): -0.27
binocular limit: +10
large telescope (visual limit): +20
large telescope (photographic limit): +25


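To put the span of that list in perspective, the relation given earlier turns the 52-magnitude gap between the Sun (-27) and the photographic limit (+25) into an intensity ratio:

    # 52 magnitudes corresponds to an intensity ratio of 10 ** (52 / 2.5),
    # i.e. the Sun appears roughly 6 x 10**20 times brighter than the
    # dimmest objects a large telescope could photograph.
    print(10 ** (52 / 2.5))  # about 6.3e20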

Sources:
Steve Adams and Jonathan Allday, Advanced Physics, 2000
Bryan Milner, Cosmology (OCR), 2000
http://tesla.phys.unm.edu/a111labs/cepheids/mags.html
