Astronomy compels the soul to look upwards and leads us from this world to another.

The Magnitude Scale

How the visual magnitude scale works in astronomy, how it relates the visual brightness of stars and why it's useful.



The visual brightness of stars, planets and other astronomical objects is described by the visual magnitude scale. In this article we look at how the scale works and how astronomers use it to measure the relative brightnesses of objects in the night sky.
Introduction to Astronomy Series
  1. Introduction to Astronomy
  2. The Celestial Sphere - Right Ascension and Declination
  3. What is Angular Size?
  4. What is the Milky Way?
  5. The Magnitude Scale
  6. Sidereal Time, Civil Time and Solar Time
  7. Equinoxes and Solstices
  8. Parallax, Distance and Parsecs
  9. Flux
  10. Luminosity of Stars
  11. Apparent Magnitude, Absolute Magnitude and Distance
  12. Variable Stars
  13. Spectroscopy and Spectrometry
  14. Redshift and Blueshift
  15. Spectral Classification of Stars
  16. Hertzsprung-Russell Diagram
  17. Kepler's Laws of Planetary Motion
  18. The Lagrange Points
  19. What is an Exoplanet?
  20. Glossary of Astronomy & Photographic Terms

The brightness of an object is a basic observable quantity. It is easy to observe two stars and say that star A is brighter than star B, but it would be handy if we had a way of quantifying this brightness so we can say that star A is x times as bright as star B. To this end the Magnitude Scale was introduced.


The Greek mathematician Hipparchus is widely credited with originating the magnitude scale, but it was Ptolemy who popularised it and brought it into the mainstream.

In the original scale, only naked-eye objects were categorised (excluding the Sun): the brightest stars were classified as magnitude 1, and the faintest objects were magnitude 6, the limit of the human eye. Each magnitude step was considered to be twice the brightness of the next, so magnitude 2 objects were twice as bright as magnitude 3 objects. This makes the magnitude scale logarithmic.

With the invention of the telescope and other observational aids, the number of known objects soared and the system needed modifying in order to accurately categorise so many new objects. In 1856 Norman Robert Pogson formalised the magnitude scale by defining a first magnitude star to be exactly 100 times brighter than a sixth magnitude star. Since five magnitude steps span a factor of 100, each single step corresponds to a factor of 100^(1/5) ≈ 2.512, so a first magnitude star is 2.512 times brighter than a second magnitude star.
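Pogson's definition fits in a couple of lines of code. This is a minimal sketch (the function name and example values are mine, not from the article): a difference of Δm magnitudes corresponds to a brightness ratio of 100^(Δm/5).

```python
def brightness_ratio(m1, m2):
    """How many times brighter a magnitude-m1 object appears
    than a magnitude-m2 object (smaller magnitude = brighter)."""
    return 100 ** ((m2 - m1) / 5)

print(brightness_ratio(1, 6))            # five steps -> exactly 100x
print(round(brightness_ratio(1, 2), 3))  # one step   -> ~2.512x
```

The same function works for any pair of magnitudes, including the negative values used for very bright objects.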

Pogson's scale was originally fixed by assigning Polaris a magnitude of 2. Astronomers later discovered that Polaris is slightly variable, so they first switched to Vega as the standard reference star, and later again switched to using tabulated zero points for the measured fluxes. This is the system used today.

A few common visual magnitudes

Two Magnitude Scales

Going back to star A and star B, let's say that star A is magnitude 2 and star B is magnitude 3. According to the magnitude scale, star A appears 2.512 times as bright as star B. Here we are referring to the star's apparent magnitude, that is, its brightness as seen from Earth. This is how magnitudes are usually presented on TV, in planetarium software and in magazines.

But how do we know that star A is actually brighter than star B? It is entirely possible for the two stars to have the same luminosity, with star B simply lying further away than star A and thus appearing dimmer from Earth.

We need another scale that compares the actual brightnesses of stars by imagining them at a fixed distance from the Earth. This scale is called absolute magnitude, and the fixed distance is set at an internationally agreed 10 parsecs. A parsec is the distance to an astronomical object that has a parallax angle of one arcsecond (1/3,600 of a degree). We will cover parallax in another article, but for now 1 parsec is equal to 3.26 light-years, or about 1.92 × 10^13 miles.
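As a quick sanity check on those conversions (a sketch using commonly quoted constants; the variable names are mine):

```python
LIGHT_YEAR_MILES = 5.879e12    # miles in one light-year (approximate)
PARSEC_LIGHT_YEARS = 3.26      # light-years in one parsec (approximate)

parsec_miles = PARSEC_LIGHT_YEARS * LIGHT_YEAR_MILES
print(f"{parsec_miles:.3g} miles")  # ~1.92e13 miles, matching the figure above
```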

Absolute magnitude is given the symbol M, while apparent magnitude is given the lower case m.

Our Sun has an apparent magnitude of -26.73, which easily makes it the brightest object visible in the sky. However, the Sun would not look nearly as bright from 10 parsecs away: at that distance it would shine at a mere magnitude of about 4.8, making it quite faint in the night sky. The Sun's magnitude at 10 parsecs is its absolute magnitude.
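The "move it to 10 parsecs" step can be sketched with the distance-modulus relation M = m - 5·log10(d / 10 pc), which a later article in this series covers in detail. The Sun's distance of 1 AU ≈ 4.848 × 10^-6 parsecs is an assumed input here:

```python
import math

def absolute_magnitude(m, distance_pc):
    """Absolute magnitude from the distance modulus:
    M = m - 5 * log10(d / 10), with d in parsecs."""
    return m - 5 * math.log10(distance_pc / 10)

AU_IN_PARSECS = 4.848e-6  # 1 astronomical unit, approximate
print(round(absolute_magnitude(-26.73, AU_IN_PARSECS), 2))  # ~4.84
```

This lands close to the commonly published value of 4.83 for the Sun's absolute magnitude.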

Sirius, the next brightest star in the sky, has an apparent magnitude of -1.47, but it lies only 2.64 parsecs away, so it is relatively close. Moved to the standard 10 parsecs it would have an absolute magnitude of about 1.4; that is roughly 23 times brighter than our Sun at the same distance.
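Applying Pogson's ratio to the absolute magnitudes (taking Sirius at about 1.4 and the Sun at about 4.8, the commonly published values) gives the brightness difference directly:

```python
def brightness_ratio(m_brighter, m_fainter):
    # How many times brighter the smaller-magnitude object is
    return 100 ** ((m_fainter - m_brighter) / 5)

# Sirius vs the Sun, both imagined at 10 parsecs
print(round(brightness_ratio(1.4, 4.8)))  # ~23
```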

Here's a quick way of remembering the difference between absolute and apparent magnitude:

Apparent magnitude appears to be brightest, Absolute magnitude absolutely is the brightest.

A more technical comparison between apparent and absolute magnitude, covering the mathematics behind the magnitude scale and how the distance modulus relates the two, appears later in this series.

Last updated on: Thursday 20th July 2017


