When astronomers began to measure the brightness of stars accurately with instruments, they found that each magnitude step corresponds to a brightness ratio of about 2.5 (more precisely 2.512, the fifth root of 100): a star of one magnitude is roughly 2.5 times brighter than a star of the next fainter (numerically greater) magnitude. This means a difference of 5 magnitudes (from magnitude 1 to magnitude 6, for example) corresponds to a brightness ratio of exactly 100. With equipment capable of more precise measurements, astronomers could assign stars decimal values, such as 2.75, rather than rounding off to magnitude 2 or 3. There are also stars brighter than magnitude 1. The star Vega (alpha Lyrae) has a visual magnitude of about 0, and the few stars brighter than Vega have negative magnitudes.
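To make the arithmetic concrete, the brightness ratio between two stars follows directly from their magnitude difference: ratio = 100^(difference / 5). Here is a minimal Python sketch of that relationship (the function name and sample values are just illustrative, not from the original text):

```python
def brightness_ratio(mag_faint, mag_bright):
    """Brightness ratio between two stars from their magnitudes.

    Each magnitude step is a factor of 100 ** (1/5), about 2.512,
    so a difference of 5 magnitudes is exactly a factor of 100.
    """
    return 100 ** ((mag_faint - mag_bright) / 5)

# A magnitude 6 star compared with a magnitude 1 star:
print(brightness_ratio(6, 1))   # 100.0 -- the magnitude 1 star is 100 times brighter
# A difference of one magnitude:
print(brightness_ratio(2, 1))   # about 2.512
```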
Astronomers usually refer to 'apparent magnitudes', that is, how bright a star appears to us here at Earth. Apparent magnitudes are often written with a lower case 'm' (like 3.24m). How bright a star appears depends not only on how luminous it actually is, but also on how far away it is. For example, a street light appears very bright directly underneath it, but much fainter from half a mile down the road. Therefore, astronomers developed the 'absolute' brightness scale. Absolute magnitude is defined as how bright a star would appear if it were exactly 10 parsecs (about 33 light years) away from Earth. For example, the Sun has an apparent magnitude of -26.7 (because it is very, very close) and an absolute magnitude of +4.8. Absolute magnitudes are often written with a capital (upper case) 'M'.
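The definition above, combined with the inverse-square falloff of light with distance, gives the standard relation between the two scales: M = m - 5 log10(d / 10 pc). A small Python sketch of that conversion follows (the function name and the Sun's distance value are illustrative, not taken from the original text):

```python
import math

def absolute_magnitude(apparent_mag, distance_pc):
    """Absolute magnitude M from apparent magnitude m and distance in parsecs.

    Imagine moving the star to exactly 10 parsecs and applying the
    inverse-square law: M = m - 5 * log10(d / 10).
    """
    return apparent_mag - 5 * math.log10(distance_pc / 10)

# The Sun: apparent magnitude -26.7 at a distance of 1 AU (about 4.85e-6 pc)
print(absolute_magnitude(-26.7, 4.848e-6))   # roughly +4.9, close to the +4.8 quoted above
```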