How do you measure the brightness of stars?
A star’s brightness in the sky is referred to as its magnitude, which is a dimensionless number. In its original ancient form, the magnitude system ranked stars from 1 to 6, with larger numbers representing fainter stars, so the faintest star visible to the naked eye had a magnitude of 6 and a very bright star had a magnitude of 1. Since then, mathematical rigour has been applied to make the system precise and consistent: it is now a logarithmic scale in which a difference of 5 magnitudes corresponds to a brightness ratio of exactly 100, so each increase of one magnitude is a decrease in brightness of about 2.5 times.

The brightest star (apart from the Sun) is Sirius, with a magnitude of -1.47, and the dimmest stars visible to the naked eye are around magnitude 6.5, depending on the condition of your eyes, light pollution, and other factors. The planet Venus can shine as brightly as magnitude -4.89, and the Sun itself shines at a magnitude of almost -27, which is partly what makes it so dangerous to observe directly.

The magnitude of a star used to be measured by taking long-exposure photographs and measuring the diameter of each star’s image on the plate, but modern observatories use sensitive light detectors to measure the brightness directly.
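To make the logarithmic relationship concrete, here is a minimal Python sketch (the function name and example values are just for illustration) that converts a difference in magnitude into a brightness ratio, using the definition that 5 magnitudes correspond to a factor of exactly 100:

```python
# Each magnitude step is a factor of 100 ** (1/5), roughly 2.512, in brightness.

def brightness_ratio(mag_a: float, mag_b: float) -> float:
    """Return how many times brighter object A is than object B,
    given their magnitudes (smaller magnitude = brighter)."""
    return 100 ** ((mag_b - mag_a) / 5)

# Sirius (magnitude -1.47) compared with the faintest naked-eye stars
# (around magnitude 6.5):
print(f"{brightness_ratio(-1.47, 6.5):.0f}")  # roughly 1500 times brighter
```

Run with any pair of magnitudes; for example, a magnitude 1 star comes out about 100 times brighter than a magnitude 6 star, matching the original six-step scale.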
Comments? Questions? Why not mail me at [email protected]