What Is Apparent Magnitude and How Do Astronomers Measure a Star’s Apparent Magnitude?

Apparent magnitude, or how bright a star appears when we look at it, is always defined in comparison to another star.

A difference of 1 in apparent magnitude means that the star with the lower magnitude appears about 2.5 times brighter than the other; larger magnitude numbers mean fainter stars.

The star Rigel appears brighter than the star Pollux; Rigel has an apparent magnitude of 0.12 and Pollux's apparent magnitude is 1.14. Because their magnitudes differ by about 1 (1.14 – 0.12 = 1.02), Rigel appears approximately 2.5 times brighter than Pollux.
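The magnitude-to-brightness conversion above can be sketched in a few lines of Python. The function below uses the standard rule that a 5-magnitude difference corresponds to a factor of exactly 100 in brightness; the star names and magnitudes are the ones from the text.

```python
def brightness_ratio(mag_faint, mag_bright):
    """How many times brighter the lower-magnitude star appears.

    A difference of 5 magnitudes is defined as a brightness factor
    of 100, so 1 magnitude is a factor of 100**(1/5), about 2.512.
    """
    return 100 ** ((mag_faint - mag_bright) / 5)

# Rigel (0.12) vs. Pollux (1.14): a difference of 1.02 magnitudes.
ratio = brightness_ratio(1.14, 0.12)
print(round(ratio, 2))  # ≈ 2.56, i.e. "approximately 2.5 times brighter"
```

Note that the ratio comes out slightly above 2.5 because the magnitude difference (1.02) is slightly more than 1.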

Stars that have negative magnitude numbers are brighter than stars with positive magnitude numbers. For instance, the following stars are in order from brightest to faintest in terms of apparent magnitude: the Sun (-26.7), Alpha Centauri (-0.27), and Altair (0.77).

Usually, the closest star appears to be the brightest star and the furthest appears to be the faintest: for example, the Sun is 93 million miles away, Alpha Centauri is 4.3 light-years away, and Altair is 16 light-years away.
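The three distances above are given in mixed units; putting them all in light-years makes the closest-is-brightest pattern easier to see. The snippet below assumes a conversion factor of about 5.879 trillion miles per light-year.

```python
# Assumed conversion: 1 light-year ≈ 5.879e12 miles.
MILES_PER_LIGHT_YEAR = 5.879e12

# The Sun's 93 million miles, expressed in light-years.
sun_ly = 93e6 / MILES_PER_LIGHT_YEAR

print(f"Sun:            {sun_ly:.7f} ly")  # a tiny fraction of a light-year
print("Alpha Centauri: 4.3 ly")
print("Altair:         16 ly")
```

On this common scale, the brightest of the three (the Sun) is overwhelmingly the closest, and the faintest (Altair) is the farthest.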

Sometimes a very luminous star that is far away will outshine a closer star: Canopus, roughly 310 light-years away, has an apparent magnitude of -0.72, brighter than nearby Alpha Centauri at -0.27.

Rigel, in the constellation Orion, lies some 900 light-years away from Earth. In Gemini, the star Pollux is about 40 light-years away.

The fact that Rigel outshines Pollux is surprising, given the difference in their distances: Rigel is more than twenty times farther away.

If Pollux and Rigel were both at the same distance from Earth, however, Rigel would appear roughly a thousand times brighter than Pollux, because apparent brightness falls off with the square of distance.
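That equal-distance comparison follows from the inverse-square law. A small sketch, using the approximate distances (900 and 40 light-years) and magnitudes (0.12 and 1.14) from the text: moving Pollux out to Rigel's distance would dim it by the square of the distance ratio, on top of the ~2.5× advantage Rigel already has.

```python
# Approximate values from the text.
d_rigel, d_pollux = 900, 40            # distances in light-years
mag_rigel, mag_pollux = 0.12, 1.14     # apparent magnitudes

# Rigel already appears ~2.5x brighter as seen from Earth.
current_ratio = 100 ** ((mag_pollux - mag_rigel) / 5)

# Inverse-square law: pushing Pollux from 40 ly out to 900 ly
# would dim it by (900/40)**2, about 506x.
equal_distance_ratio = current_ratio * (d_rigel / d_pollux) ** 2

print(round(equal_distance_ratio))  # on the order of 1,300
```

The exact figure depends on the distance estimates used, but any reasonable values give a ratio in the thousands, not a factor of two: Rigel is intrinsically far more luminous than Pollux.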