• Question: How is a star's magnitude measured? I know it gets a value like -0.1 or 2.5, but how do you get to these numbers? What is magnitude measured in?

    Asked by 08wjarvis to Evan on 20 Jun 2011.
    • Photo: Evan Keane

      Evan Keane answered on 20 Jun 2011:


      Great question! Well, it all started with the Greeks, as many things did. They had a look at all the stars and decided to give them a magnitude. They said the brightest ones were magnitude 1, fainter ones were then 2, 3, 4, 5, and the faintest ones they could see with their eyes they called magnitude 6. So it wasn't very precise. Then astronomers started measuring things more precisely and realised that stars of magnitude 1 were about 100 times brighter than magnitude 6 ones, so they changed what magnitudes meant to make it exactly like this. So a difference of 5 magnitudes means a difference of 100 times in brightness. There are also things which are much brighter than magnitude 1. Some stars are about magnitude 0, and the planets can be very bright sometimes. Iridium flares can be -9 (10,000 times brighter than a very bright star!) – have you ever seen one? They are cool! There is a chance to see them almost every night. The Moon is about -12 and the Sun is about -26.

      Extra expert mathsy bit … a difference of 5 magnitudes is a difference of 100 times in brightness, but a difference of 10 magnitudes is NOT a difference of 200 times, it is a difference of 100 × 100 = 10,000. This is because the scale is logarithmic (our eyes respond to light in this way), so to get the difference in brightness you do 10 to the power of -(m2 - m1)/2.5, and that gives you the brightness of star 2 divided by the brightness of star 1.
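
      If you like computers as well as maths, here is a little Python sketch of that formula (the function name brightness_ratio is just made up for this example), checking the numbers mentioned above:

      # Brightness of star 2 divided by brightness of star 1, from their magnitudes.
      def brightness_ratio(m1, m2):
          return 10 ** (-(m2 - m1) / 2.5)

      # A difference of 5 magnitudes is exactly a factor of 100:
      print(brightness_ratio(6, 1))    # 100.0

      # A difference of 10 magnitudes is 100 x 100 = 10,000 (not 200):
      print(brightness_ratio(11, 1))   # 10000.0

      # An Iridium flare (-9) compared with a very bright star (about magnitude 1):
      print(brightness_ratio(1, -9))   # 10000.0, i.e. about 10,000 times brighter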
