Monday, January 30, 2012

How is apparent brightness of a star measured?

How do astronomers measure the apparent brightness of a star? Apparent brightness is how bright the star seems from Earth. I would like a simple answer; no long formula, please.

The brighter the object appears, the lower the numerical value of its magnitude.



Very bright objects have negative magnitudes. For example, Sirius, the brightest star of the celestial sphere, has an apparent magnitude of −1.46.





In the early Greek system, the brightest stars were called "Magnitude 1" and the faintest were called "Magnitude 6". They did this because, to the Greeks, 1 was the perfect number, 2 was less perfect than 1, and so on.





Modern astronomers have fixed the scale so that a star which is 5 magnitudes brighter than another star delivers 100 times more flux. For example, the full Moon has an apparent magnitude of about −12.5, and Mars at its brightest has an apparent magnitude of about −2.8. This means the full Moon is about 9.7 magnitudes (−12.5 − (−2.8)) brighter than Mars at its maximum brightness.

Originally, apparent magnitudes were measured with the naked eye, comparing the star being studied with standard stars of known magnitude. Later, photography was used, being more accurate and less subjective; in addition, large numbers of stars could be dealt with quickly. Modern methods use photoelectric devices that actually measure the quantity of light reaching the Earth from a star.
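As a rough sketch of that arithmetic, the flux ratio implied by a magnitude difference can be computed directly (the magnitudes are the illustrative values quoted above; the function name flux_ratio is my own):

def flux_ratio(m_bright, m_faint):
    # Each 5-magnitude step corresponds to a factor of 100 in flux,
    # so the ratio is 10 raised to (difference / 2.5).
    return 10 ** ((m_faint - m_bright) / 2.5)

# Sanity check: a 5-magnitude difference is exactly a factor of 100.
print(flux_ratio(0.0, 5.0))        # 100.0

# Full Moon (about -12.5) versus Mars at its brightest (about -2.8):
# a difference of roughly 9.7 magnitudes, several thousand times more flux.
print(flux_ratio(-12.5, -2.8))     # ~7,600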



HTH



Charles

As simple as it can get: nowadays everything is measured with what amounts to a light meter, such as the one on a camera, only far more sophisticated. It automatically records the amount of light hitting the instrument.
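As a rough illustration of how such a light-meter reading becomes a magnitude, the detector count for a target can be compared against a reference star of known magnitude (the counts, the reference magnitude, and the function name apparent_magnitude below are all made up for illustration):

import math

def apparent_magnitude(counts_target, counts_ref, mag_ref):
    # Differential photometry: the target's measured light is compared
    # with a reference star whose apparent magnitude is already known.
    return mag_ref - 2.5 * math.log10(counts_target / counts_ref)

# Hypothetical readings in arbitrary detector units, reference star at magnitude 2.0.
print(apparent_magnitude(50_000.0, 20_000.0, 2.0))   # about 1.0 (brighter than the reference)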
