By Malcolm Gibb

What does it mean? Well, to a mathematician it means the length of a vector when represented in line segment form. To a seismologist it is a measure of the total energy released by an earthquake; the Richter scale uses magnitudes to describe an earthquake's intensity. It can also mean the absolute value of a number, that is, the number with its + or - sign removed, written between two vertical straight lines. It is also used in everyday language to mean the size or importance of something.

But we, being good astronomers, know it only means one thing: the optical brightness of a celestial object. The Greeks started it all back in the second century BC by deciding that the first stars visible after sunset were of the first magnitude and the last were of the sixth magnitude. In the 1850s the English astronomer Norman Pogson tinkered with the system and the scale became logarithmic, with a difference of five magnitudes meaning a brightness difference of exactly one hundred times. This means a difference of one magnitude corresponds to a brightness difference of about 2.512 (the fifth root of one hundred), so a second magnitude star is about 2.512 times fainter than a first magnitude star, a third magnitude star is about 2.512 times fainter than a second magnitude star, and so on.
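Pogson's rule is easy to turn into arithmetic: the brightness ratio between two objects is 100 raised to the power of their magnitude difference divided by five. A minimal sketch in Python (the function name is just for illustration):

```python
def brightness_ratio(m_faint, m_bright):
    """How many times brighter the m_bright object is than the m_faint one,
    using Pogson's rule: ratio = 100 ** (magnitude difference / 5)."""
    return 100 ** ((m_faint - m_bright) / 5)

# One magnitude apart -> the fifth root of 100, about 2.512
print(brightness_ratio(2.0, 1.0))
# Five magnitudes apart -> exactly 100
print(brightness_ratio(6.0, 1.0))
```

Note that the ratios multiply: three steps of one magnitude each give 2.512 × 2.512 × 2.512, the same as one step of three magnitudes.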

In modern times, the scale has been extended in both directions, so magnitude -3 is brighter than magnitude 0 which is in turn brighter than magnitude +3, therefore the lower the value the brighter the object. What a pity money doesn't follow the same rules!

Here are some examples of magnitude values for well-known objects:

Sun: -26.7 (about 400,000 times brighter than the full Moon)
Full Moon: -12.7
Brightest Iridium flares: -8 (see other article)
Venus (at its brightest): -4.4
Sirius (brightest star): -1.44
Mir space station: -1 (variable)
Limit of the human eye: +6 (obviously will vary with the individual)
Limit of 10 x 50 binoculars: +9
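The "about 400,000 times" figure for the Sun against the full Moon follows directly from Pogson's rule. A quick check, using the magnitudes listed above:

```python
# Sun vs full Moon: a 14-magnitude difference
sun, full_moon = -26.7, -12.7
ratio = 100 ** ((full_moon - sun) / 5)
print(round(ratio))  # roughly 398,000 -- "about 400,000 times brighter"
```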