The convention of absolute magnitude became necessary when it was realised that
stars vary greatly in their distances from Earth.
The brightness of stars was originally measured by their apparent magnitude - how bright they appear to an observer on Earth, which is related logarithmically to the intensity of light received. It was once believed that all stars sat at a fixed distance from our planet, an idea central to the cosmology of the ancient Greeks. Once it became clear that their distances actually vary enormously, apparent magnitude could no longer be treated as a true measure of brightness, because distance affects the intensity of light received on Earth.
To make a true comparison of the brightness of stars, they would need to be viewed from the same distance. The distance chosen for absolute magnitude is 10 parsecs, which is equal to about 32.6 light years. Obviously we can't send some guy up there to look at the star from 10 parsecs away, but if the distance to the star is known, we can use the inverse square law to calculate its absolute magnitude from its apparent magnitude. The equation used is:
M = m - 5 log10 (d/10)
where M is absolute magnitude, m is apparent magnitude and d is distance from Earth in parsecs.
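If you want to try the formula yourself, here is a minimal Python sketch. The figures for Sirius (apparent magnitude about -1.46, distance about 2.64 parsecs) are commonly quoted rounded values used for illustration, not taken from the book:

```python
import math

def absolute_magnitude(m, d_parsecs):
    """Absolute magnitude from apparent magnitude m and distance d in parsecs,
    using M = m - 5 log10(d / 10)."""
    return m - 5 * math.log10(d_parsecs / 10)

# Sirius: m ~ -1.46, d ~ 2.64 pc  ->  M ~ +1.4
print(round(absolute_magnitude(-1.46, 2.64), 2))
```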
The difference m - M depends only on the star's distance (it equals 5 log10(d/10)), which is why it is called the distance modulus.
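In practice the distance modulus is often used the other way round: if both magnitudes are known, the formula can be rearranged to give the distance, d = 10 x 10^((m - M)/5) parsecs. A quick sketch with made-up values:

```python
def distance_from_modulus(m, M):
    """Distance in parsecs, from rearranging M = m - 5 log10(d / 10)."""
    return 10 * 10 ** ((m - M) / 5)

# A distance modulus of m - M = 5 corresponds to exactly 100 parsecs.
print(distance_from_modulus(9.0, 4.0))   # 100.0
```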
Just in case you get confused (this is me writing after all), here is the distinction between the key terms:
Intensity - the amount of light energy received from a star per unit area at a particular place.
Luminosity - the total amount of light actually radiated by a star. Related directly to absolute magnitude, but not to apparent magnitude.
Magnitude - the perceived brightness of an object at a particular place. Not the same as intensity, because human perception is logarithmic (see the sketch below). Apparent magnitude is measured from Earth; absolute magnitude is what would be measured from 10 parsecs.
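To make that logarithmic link concrete: by definition a difference of 5 magnitudes corresponds to a factor of 100 in intensity, so the magnitude difference between two objects is m1 - m2 = -2.5 log10(I1/I2). This relation isn't spelled out in the passage above, but here is a minimal sketch of it:

```python
import math

def magnitude_difference(intensity_ratio):
    """Magnitude difference m1 - m2 for an intensity ratio I1/I2,
    using m1 - m2 = -2.5 log10(I1 / I2)."""
    return -2.5 * math.log10(intensity_ratio)

# An object delivering 100 times the intensity appears 5 magnitudes
# brighter, i.e. its magnitude is lower by 5.
print(magnitude_difference(100))   # -5.0
```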
Source: Steve Adams and Jonathan Allday, Advanced Physics, 2000