The Magnitude Scale
The brightness of an object is a basic observable quantity. It is easy to observe two stars and say that star A is brighter than star B, but it would be handy if we had a way of quantifying this brightness so we can say that star A is x times as bright as star B. To this end the Magnitude Scale was introduced.
The Greek mathematician Hipparchus is widely credited with originating the magnitude scale, but it was Ptolemy who popularised it and brought it into the mainstream.
In his original scale, only naked-eye objects were categorised (excluding the Sun): the brightest stars were classified as magnitude 1, and the faintest objects, at the limit of the human eye, were magnitude 6. Each magnitude level was considered to be twice the brightness of the next, so magnitude 2 objects are twice as bright as magnitude 3 objects. Equal steps in magnitude correspond to equal ratios in brightness, making this a logarithmic scale.
With the invention of the telescope and other observational aids, the number of known objects soared, and the system needed modification to categorise so many new objects accurately. In 1856 Norman Robert Pogson formalised the magnitude scale by defining a first magnitude object to be exactly 100 times brighter than a sixth magnitude object. One magnitude step therefore corresponds to a brightness ratio of the fifth root of 100, so a first magnitude star is about 2.512 times brighter than a second magnitude star.
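Pogson's definition can be captured in a few lines. Here is a minimal sketch in Python (the function name is my own):

```python
def brightness_ratio(m1, m2):
    """How many times brighter an object of magnitude m1 is than one of
    magnitude m2, using Pogson's definition: 5 magnitudes = a factor of 100."""
    return 100 ** ((m2 - m1) / 5)

# One magnitude step is the fifth root of 100, about 2.512:
print(round(brightness_ratio(1, 2), 3))  # 2.512

# Five steps recover the defining factor of 100 exactly:
print(brightness_ratio(1, 6))  # 100.0
```

Note that a *larger* magnitude means a *fainter* object, which is why the brighter object has the smaller number.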
Pogson's scale was originally fixed by assigning Polaris a magnitude of 2. When astronomers later discovered that Polaris is slightly variable, the reference was switched first to Vega, and later still to tabulated zero points for measured fluxes. This is the system used today.
Two Magnitude Scales
Going back to star A and star B, let's say that star A is magnitude 2 and star B is magnitude 3. According to the magnitude scale, star A appears 2.512 times as bright as star B. Here we are referring to the stars' Apparent Magnitude, that is, their brightness as seen from Earth. This is how most magnitudes are presented on TV, in planetarium software and in magazines.
But how do we know that star A is actually brighter than star B? It is entirely possible for the two stars to have the same luminosity, with star B simply lying further away than star A and therefore appearing dimmer from Earth.
We need another scale that compares the intrinsic brightness of stars: the brightness each would have if placed at a fixed distance from the Earth. This scale is called the Absolute Magnitude, and the fixed distance is internationally agreed at 10 parsecs. A parsec is the distance to an astronomical object that has a parallax angle of one arcsecond (1/3,600 of a degree). We will cover parallax in another article, but for now, 1 parsec is equal to 3.26 light-years, or 1.92 × 10¹³ miles.
Absolute Magnitude is given the symbol M, while Apparent Magnitude is given lowercase m.
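The two magnitudes are linked by the distance modulus, M = m − 5 log₁₀(d / 10), where d is the distance in parsecs. A minimal sketch in Python (the function name is my own):

```python
import math

def absolute_magnitude(m, d_parsecs):
    """Absolute magnitude M from apparent magnitude m and distance in
    parsecs, via the distance modulus: M = m - 5 * log10(d / 10)."""
    return m - 5 * math.log10(d_parsecs / 10)

# A star of apparent magnitude 7.0 at 100 parsecs is intrinsically brighter
# than it looks: moved to 10 parsecs it would shine at magnitude 2.0.
print(absolute_magnitude(7.0, 100))  # 2.0
```

A star closer than 10 parsecs works the other way round: its absolute magnitude is larger (fainter) than its apparent magnitude.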
Our Sun has an apparent magnitude of -26.73, which easily makes it the brightest object in the sky, but it would not look nearly as bright from 10 parsecs away. At that distance it would shine at an apparent magnitude of only about 4.8, quite faint in the night sky. The Sun's magnitude at 10 parsecs is, by definition, its Absolute Magnitude.
Sirius, the next brightest star in the sky, has an apparent magnitude of -1.47, but it lies only 2.64 parsecs away, so it is relatively close. If it were moved to the standard 10 parsecs it would have an absolute magnitude of about 1.4, making it roughly 23 times brighter than our Sun at the same distance.
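These figures can be checked with the distance modulus. A sketch in Python (the AU-to-parsec conversion and the names are mine; the inputs are the apparent magnitudes and distances quoted above):

```python
import math

AU_IN_PARSECS = 1 / 206_265  # one astronomical unit expressed in parsecs

def absolute_magnitude(m, d_parsecs):
    # Distance modulus: M = m - 5 * log10(d / 10 pc)
    return m - 5 * math.log10(d_parsecs / 10)

M_sun = absolute_magnitude(-26.73, AU_IN_PARSECS)  # ~4.84
M_sirius = absolute_magnitude(-1.47, 2.64)         # ~1.42

# How many times brighter Sirius is than the Sun at the common 10 parsecs:
ratio = 100 ** ((M_sun - M_sirius) / 5)            # ~23
print(round(M_sun, 2), round(M_sirius, 2), round(ratio))
```

The computed Sun value of about 4.84 agrees with the commonly quoted absolute magnitude of 4.83 to within rounding of the inputs.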
Here's a quick way of remembering the difference between absolute and apparent magnitude:
Apparent magnitude is how bright a star appears to be; absolute magnitude is how bright it absolutely is.
Here is a more technical comparison between apparent and absolute magnitude, which looks at the mathematics behind the magnitude scale and how the distance modulus affects them.