The Astronomical Magnitude Scale
The visual brightness of stars, planets and other astronomical objects is measured on the magnitude scale. We look at this scale and how astronomers use it to quantify the relative brightness of objects in the night sky.
The brightness of an object is a basic observable quantity. It is easy to observe two stars and say that star A is brighter than star B, but it would be handy to have a way of quantifying this brightness so we can say that star A is x times as bright as star B. To this end, the magnitude scale was introduced.
History of the Magnitude Scale
The Greek astronomer Hipparchus is widely credited with originating the magnitude scale, but it was Ptolemy who popularised it and brought it into the mainstream.
In the original scale, only naked-eye objects were categorised (excluding the Sun): the brightest stars were classified as magnitude 1, and the faintest objects were magnitude 6, the limit of the human eye. Each step of one magnitude was considered to be roughly twice the brightness of the next, so magnitude 2 objects are about twice as bright as magnitude 3 objects. Because each step corresponds to a fixed ratio of brightness rather than a fixed difference, this is a logarithmic scale.
With the invention of the telescope and other observational aids, the number of known objects soared, and the system needed modifying in order to accurately categorise so many new objects. In 1856 Norman Robert Pogson formalised the magnitude scale by defining a first magnitude object to be exactly 100 times brighter than a sixth magnitude object; each step of one magnitude is therefore a brightness ratio of 100^(1/5), so a first magnitude star is 2.512 times brighter than a second magnitude star.
Pogson's scale was originally fixed by assigning Polaris a magnitude of 2. Astronomers later discovered that Polaris is slightly variable, so they switched first to Vega as the standard reference star, and later to tabulated zero points for the measured fluxes. This is the system used today.
Two Magnitude Scales
Going back to star A and star B, let's say that star A is magnitude 2 and star B is magnitude 3. According to the magnitude scale, star A would appear to be 2.512 times as bright as star B. Here we are referring to the star's apparent magnitude, that is, its brightness as seen from Earth. This is how most magnitudes are presented on TV, in planetarium software and in magazines.
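Pogson's relation is easy to check numerically. A minimal sketch (the function name is our own, for illustration):

```python
# Brightness ratio from a magnitude difference (Pogson's relation).
# A difference of 5 magnitudes is defined as a factor of exactly 100
# in brightness, so each magnitude step is 100**(1/5), about 2.512.

def brightness_ratio(m_faint, m_bright):
    """Return how many times brighter the brighter object appears."""
    return 100 ** ((m_faint - m_bright) / 5)

# Star A (magnitude 2) versus star B (magnitude 3):
print(round(brightness_ratio(3, 2), 3))  # 2.512
```

Note that brighter objects have *smaller* magnitudes, which is why the fainter magnitude comes first in the subtraction.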
But how do we know that star A is actually brighter than star B? It is entirely possible for the two stars to have the same luminosity, with star B simply lying further away than star A and therefore appearing dimmer from Earth.
We need another scale which compares the actual brightness of stars as if they were all a fixed distance from the Earth. This scale is called Absolute Magnitude, and the fixed distance is set at an internationally agreed 10 parsecs. A parsec is the distance from the Earth to an astronomical object which has a parallax angle of one arcsecond (1/3,600 of a degree). We will cover parallax in another article, but for now 1 parsec is equal to 3.26 light-years, or 1.92 × 10^13 miles.
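As a quick sketch of that conversion (using the standard values of 3.2616 light-years per parsec and roughly 5.8786 × 10^12 miles per light-year):

```python
# Converting parsecs to miles via light-years.
LY_PER_PARSEC = 3.2616       # light-years in one parsec
MILES_PER_LY = 5.8786e12     # miles in one light-year

def parsecs_to_miles(pc):
    """Distance in miles for a distance given in parsecs."""
    return pc * LY_PER_PARSEC * MILES_PER_LY

print(f"{parsecs_to_miles(1):.3g} miles")  # about 1.92e+13 miles
```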
Absolute Magnitude is given the symbol M, while Apparent Magnitude is given lower case m.
Our Sun has an apparent magnitude of -26.73, which easily makes it the brightest object visible in the sky. However, the Sun would not look nearly so impressive from 10 parsecs away: at that distance it would shine at a mere apparent magnitude of +4.83, quite faint in the night sky. The Sun's magnitude at 10 parsecs is, by definition, its absolute magnitude.
Sirius, the next brightest star in the sky, has an apparent magnitude of -1.47, but it lies only 2.64 parsecs away, so it is relatively close. If it were moved to the standard 10 parsecs it would have an absolute magnitude of +1.4, making it roughly 24 times brighter than our Sun at the same distance.
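That factor follows directly from Pogson's relation applied to the two absolute magnitudes. A small sketch, assuming the commonly quoted values of M ≈ +4.83 for the Sun and M ≈ +1.4 for Sirius:

```python
# Comparing Sirius and the Sun at the same (10 parsec) distance.
# The absolute magnitudes below are commonly quoted values.
M_SUN = 4.83
M_SIRIUS = 1.4

# A 5-magnitude difference is a factor of 100 in brightness.
ratio = 100 ** ((M_SUN - M_SIRIUS) / 5)
print(f"Sirius is about {ratio:.0f} times brighter")  # about 24 times
```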
Here's a quick way of remembering the difference between absolute and apparent magnitude:
Apparent magnitude appears to be brightest, Absolute magnitude absolutely is the brightest.
Example Stars on the Magnitude Scale
In this table, we can see examples of various points on the magnitude scale.
| Object | Apparent Magnitude |
| --- | --- |
| Sirius (brightest star) | -1.44 |
| Naked Eye Limit (urban) | +3 |
| Naked Eye Limit (dark skies) | +6 |
| 12" Telescope Limit | +14 |
| 200" Telescope Limit | +20 |
| Hubble Telescope Limit | +30 |
Apparent Magnitude, Absolute Magnitude and Distance
There are two main types of magnitude commonly used in astronomy. The first of these, apparent magnitude, is the brightness of the object as seen by an observer on the Earth. The apparent magnitude of a star is dependent on two factors:
- The luminosity of the star (total energy per second radiated)
- The distance of the star from Earth
The second, absolute magnitude, depends solely on the star's luminosity and can be regarded as an intrinsic property of the star. Absolute magnitude is defined as the apparent magnitude an object would have if it were a standard distance from the Earth, and the standard distance is 10 parsecs. Since the distance is the same for every star when comparing absolute magnitudes, it drops out as a factor in the star's brightness, which is why absolute magnitude can be regarded as an intrinsic property.
Absolute magnitude and Luminosity
A star's luminosity, L, is the total amount of energy radiated per unit time. The absolute magnitude of a star is related to its luminosity in the same way as apparent magnitude is related to flux. If we compare the ratio of the brightness of two stars, expressed in terms of their luminosities, then we obtain a relation for the difference in their absolute magnitudes.
Equation 23 - Absolute Magnitude Relation

$$M_1 - M_2 = -2.5 \log_{10} \left( \frac{L_1}{L_2} \right)$$
Capital letters are used to indicate absolute magnitudes and lower case letters are used to identify apparent magnitudes.
As we have previously stated, absolute magnitude is the apparent magnitude of an object if it was a distance of 10 parsecs from the Earth.
It is clear from this definition that a star located at 10 parsecs from the Earth will have the same apparent and absolute magnitude. A star that is further away than 10 parsecs will have a fainter apparent magnitude than absolute magnitude and a star that is closer than 10 parsecs will have a brighter apparent magnitude than absolute magnitude.
How do we know a star's absolute magnitude? We could travel to every star and measure its apparent brightness from a distance of 10 parsecs, but at the moment that really isn't a practical solution. Luckily for us, however, the apparent and absolute magnitudes are related by a very important formula.
Distance Modulus is the difference between the apparent and absolute magnitudes. This can be obtained by combining the definition of absolute magnitude with an expression for the inverse square law and Pogson's relation. Using the distance modulus it is possible to establish a relationship between the absolute magnitude, M, of a star, its apparent magnitude, m, and its distance, d.
The inverse square law tells us that for a star at distance d (in parsecs) with observed flux Fm, its flux FM at 10 parsecs would be given by:
Equation 24 - Inverse Square Law for Flux

$$\frac{F_M}{F_m} = \left( \frac{d}{10} \right)^2$$
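The scaling is easy to sanity-check in code. A minimal sketch, with flux in arbitrary units and a hypothetical function name:

```python
# Inverse square law: moving a star from distance d (parsecs) to the
# standard 10 parsecs changes its observed flux by a factor of (d/10)**2.

def flux_at_10pc(flux_observed, d_parsecs):
    """Flux the star would have at the standard 10 parsec distance."""
    return flux_observed * (d_parsecs / 10) ** 2

# A star at 20 parsecs, moved in to 10 parsecs, appears 4 times brighter:
print(flux_at_10pc(1.0, 20))  # 4.0
```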
We can combine this with equation 23 above to give the distance modulus equation.
Equation 25 - Distance Modulus

$$m - M = 5 \log_{10} \left( \frac{d}{10} \right) = 5 \log_{10}(d) - 5$$
If we measure a star's apparent magnitude and its distance in parsecs is known, then we can determine the absolute magnitude and hence the luminosity of the star. If we know the star's absolute and apparent magnitudes, we can use the distance modulus to calculate the distance to the star. This equation is very powerful and will be used a great many times in upcoming tutorials.
The formula for calculating Absolute Magnitude within our galaxy is:
Equation 31 - Absolute Magnitude

$$m - M = 5 \left( \log_{10} D - 1 \right)$$
Where D is the distance to the star in parsecs.
Barnard's Star lies 1.82 parsecs away and has an observed (apparent) magnitude of 9.54.
m - M = 5((log10 D) - 1)
M = 9.54 - 5((log10 1.82) - 1)
M = 9.54 - (-3.70)
M = 13.24
If Barnard's Star were to be moved to a distance of 10 parsecs from the Earth it would then have a magnitude of 13.24.
If we already know both Apparent and Absolute magnitudes, it is possible to calculate the distance to the star:
d = 10^(0.2(m - M + 5))
Using Barnard's Star again,
d = 10^(0.2(9.54 - 13.24 + 5))
d = 10^0.26
d = 1.82 parsecs
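Both directions of the distance modulus can be wrapped up as small functions, which reproduce the Barnard's Star figures above (the function names are our own, for illustration):

```python
import math

# Distance modulus: m - M = 5*(log10(d) - 1), with d in parsecs.

def absolute_magnitude(m, d_parsecs):
    """Absolute magnitude from apparent magnitude and distance."""
    return m - 5 * (math.log10(d_parsecs) - 1)

def distance_parsecs(m, M):
    """Distance in parsecs from apparent and absolute magnitudes."""
    return 10 ** (0.2 * (m - M + 5))

# Barnard's Star: m = 9.54 at d = 1.82 parsecs.
M = absolute_magnitude(9.54, 1.82)
print(round(M, 2))                           # 13.24
print(round(distance_parsecs(9.54, M), 2))   # 1.82
```

A useful sanity check: a star at exactly 10 parsecs has equal apparent and absolute magnitudes, since log10(10) - 1 = 0.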
Another type of magnitude of interest to astronomers is the bolometric magnitude. The absolute and apparent magnitudes discussed so far are based only on the visible light radiated by a star, yet much of a star's energy is emitted at other wavelengths, and not all of it reaches the ground since some is filtered out by our atmosphere.
Bolometric magnitude is based on the flux across the entire electromagnetic spectrum. The term absolute bolometric magnitude is based specifically on the luminosity (the total rate of energy output) of the star.
The bolometric magnitude, Mbol, takes into account electromagnetic radiation at all wavelengths, including those unobserved due to the instrumental pass-band, the Earth's atmospheric absorption, and extinction by interstellar dust. It is defined in terms of the star's luminosity; for stars with few observations, it must be computed assuming an effective temperature.
Equation 39 - Bolometric Magnitude

$$M_{\mathrm{bol}} = -2.5 \log_{10} \left( \frac{L}{L_\odot} \right) + 4.74$$
This can then be reworked to find the ratio of luminosity.
Equation 40 - Luminosity ratio of magnitudes

$$\frac{L_1}{L_2} = 10^{\,0.4 \left( M_{\mathrm{bol},2} - M_{\mathrm{bol},1} \right)}$$
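A small sketch of both relations, assuming the Sun's absolute bolometric magnitude of +4.74 as the zero point (the function names are our own):

```python
import math

# Bolometric magnitude from luminosity, relative to the Sun.
# M_BOL_SUN = +4.74 is the Sun's absolute bolometric magnitude;
# L is given in units of the solar luminosity.
M_BOL_SUN = 4.74

def bolometric_magnitude(L_solar):
    """Absolute bolometric magnitude of a star of luminosity L (in L_Sun)."""
    return M_BOL_SUN - 2.5 * math.log10(L_solar)

def luminosity_ratio(M1, M2):
    """L1 / L2 given two bolometric magnitudes."""
    return 10 ** (0.4 * (M2 - M1))

print(bolometric_magnitude(1))  # 4.74 (the Sun itself)
print(luminosity_ratio(0, 5))   # 100.0 (5 magnitudes = factor of 100)
```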
Last updated on: Wednesday 24th January 2018