
How Bright Are Stars and How Many Different Ways Can Astronomers Determine a Star’s Brightness?



Posted

There are three ways to talk about a star's brightness.

 

We can discuss the apparent magnitude of a star, or how bright it appears when we look at it.

 

We know, however, that stars nearer the Earth appear brighter than faraway stars, even though they may not actually be intrinsically brighter.

 

So, astronomers also talk about the absolute magnitude of a star, which is how bright the star would appear if the observer were 10 parsecs away from it.
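
To make the 10-parsec convention concrete: apparent and absolute magnitude are tied together by the distance modulus, m - M = 5 log10(d / 10 pc). Here is a minimal Python sketch of that relation (the function name and example numbers are just for illustration):

```python
import math

def absolute_magnitude(apparent_mag: float, distance_pc: float) -> float:
    """Convert apparent magnitude to absolute magnitude via the
    distance modulus: m - M = 5 * log10(d / 10 pc)."""
    return apparent_mag - 5.0 * math.log10(distance_pc / 10.0)

# Example: a star of apparent magnitude 0.0 at a distance of 25 parsecs
print(absolute_magnitude(0.0, 25.0))  # roughly -1.99
```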

 

The third way to measure a star’s brightness is called luminosity.

 

Luminosity is a measure of how much energy a star puts out in comparison to our Sun.

Posted (edited)

...

 

The third way to measure a star’s brightness is called luminosity.

 

Luminosity is a measure of how much energy a star puts out in comparison to our Sun.

 

When you don't specify that it is visible luminosity, you mean the wattage across the whole EM spectrum (including wavelengths the eye does not detect). That is measured by a bolometer (basically a broadband light meter) and then adjusted for distance.

In a typical device you isolate the light from the star and see how much absorbing it heats the bolometer.

You give the result in watts or in multiples of the Sun's standard luminosity, which is around 4×10^26 watts.
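
Converting between watts and solar luminosities is just a division by the Sun's output. A small sketch, using the 3.846×10^26 W figure from the Wikipedia passage quoted further down (the constant and function names are made up for illustration):

```python
L_SUN_W = 3.846e26  # solar luminosity in watts (value quoted below from Wikipedia)

def to_solar_luminosities(power_watts: float) -> float:
    """Express a star's total power output in multiples of the Sun's."""
    return power_watts / L_SUN_W

# Example: a star radiating 1.0e28 W is roughly 26 times as luminous as the Sun
print(to_solar_luminosities(1.0e28))  # about 26.0
```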

 

"Magnitude" is a logarithmic way of reporting luminosity. Like you suggest it is just a way of talking about it. For me, the absolute bolometric luminosity (or simply luminosity) is the only real measure. The rest is just ways of talking that have accumulated over time. Astronomers have a long history of accruing different ways of saying the same thing---each one appropriate to some context where it is convenient to talk that way. Like "apparent visible magnitude" is convenient for people out stargazing. They get so they can judge it by eye.

 

Because stars on the "main sequence" (that is, typical stars not near the end of their lives) follow a very regular pattern, you can GUESS a star's luminosity simply from its color, that is, its spectral lines (you know what I mean, the type of rainbow it makes).

That is not measuring; that is guessing, based on past experience with other stars of the same color pattern where we already made a real measurement of luminosity. Estimating luminosity from the color pattern of a star is a whole other thing.

 

 

To me, well, I take the very simple view that the basic concept is actual energy output = wattage = luminosity, and there is only ONE way to determine it: measure with a bolometer, and then you also have to know the DISTANCE.
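
That "bolometer plus distance" recipe is just the inverse-square law: the measured flux is spread back over a sphere of radius d, so L = 4 π d^2 F. A rough sketch (the flux value in the example is invented for illustration):

```python
import math

PARSEC_M = 3.0857e16  # metres in one parsec

def luminosity_from_flux(flux_w_per_m2: float, distance_pc: float) -> float:
    """Total power output from measured bolometric flux and distance:
    L = 4 * pi * d^2 * F."""
    d_m = distance_pc * PARSEC_M
    return 4.0 * math.pi * d_m**2 * flux_w_per_m2

# Example: a flux of 2.5e-8 W/m^2 from a star 1.3 pc away
# gives roughly 5e26 W, i.e. a bit over one solar luminosity.
print(luminosity_from_flux(2.5e-8, 1.3))
```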

 

 

So the really interesting quantity to estimate is distance.

 

There is a ladder of different ways of estimating distance, and you may know something about that. For the nearest stars, parallax. Then the open cluster method. And Cepheids. And supernovae. And the HR diagram is used along the way as a kind of check. I'm probably forgetting some of the rungs in the distance ladder. That is one of the most interesting basic topics in astronomy.
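
For the first rung, parallax, the conversion is as simple as it gets: distance in parsecs is one over the annual parallax in arcseconds. A tiny sketch:

```python
def distance_from_parallax(parallax_arcsec: float) -> float:
    """Distance in parsecs from annual parallax in arcseconds (d = 1 / p)."""
    return 1.0 / parallax_arcsec

# Example: a parallax of 0.1 arcsecond corresponds to 10 parsecs
print(distance_from_parallax(0.1))  # 10.0
```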

 

EDIT: I found the Wikipedia article on luminosity. Wikipedia is not always so good or reliable, but it can sometimes be a big help, and this article looked OK to me. I didn't read it all, but this part seemed all right:

The luminosity of stars is measured in two forms: apparent (counting visible light only) and bolometric (total radiant energy); a bolometer is an instrument that measures radiant energy over a wide band by absorption and measurement of heating. When not qualified, luminosity means bolometric luminosity, which is measured in the SI unit watts, or in terms of solar luminosities, L☉; that is, how many times as much energy the object radiates as the Sun, whose luminosity is 3.846×10^26 W.

Luminosity is an intrinsic, measurable property independent of distance. It is usually expressed as an absolute magnitude: either the absolute visual magnitude, corresponding to the apparent brightness in visible light of a star as seen from the interstellar distance of 10 parsecs, or the bolometric magnitude, corresponding to the bolometric luminosity. In contrast, apparent brightness falls off with distance according to an inverse-square law. On top of this decrease from increased distance comes an extra decrease in brightness due to "extinction" by intervening interstellar dust. Visible brightness is usually measured by apparent magnitude. Both absolute and apparent magnitudes are on an inverse logarithmic scale, where an increase of 5 magnitudes corresponds to a decrease in (non-logarithmic) brightness by a factor of 100.

By measuring the width of certain absorption lines in the stellar spectrum, it is often possible to assign a luminosity class to a star without knowing its distance. Thus a fair estimate of its absolute magnitude can be made without knowing either its distance or the interstellar extinction; conversely, the distance and extinction can then be determined without measuring the yearly parallax directly. Since the stellar parallax is too small to be measured for many faraway stars, this is a common method of determining distances.

In measuring star brightnesses, visible luminosity (not total luminosity at all wavelengths), apparent magnitude (visible brightness), and distance are interrelated parameters: if you know two, you can determine the third. Since the Sun's luminosity is the standard, comparing these parameters with the Sun's apparent magnitude and distance is the easiest way to remember how to convert between them.
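
As a sketch of that "know two, get the third" idea, here is one way to get distance from visible luminosity (in solar units) and apparent magnitude by scaling from the Sun, taking the Sun's apparent visual magnitude as about -26.74 and its distance as 1 AU (the function name and the example star are illustrative):

```python
import math

SUN_APPARENT_MAG = -26.74    # apparent visual magnitude of the Sun
SUN_DISTANCE_PC = 4.8481e-6  # 1 astronomical unit expressed in parsecs

def distance_from_two(luminosity_solar: float, apparent_mag: float) -> float:
    """Distance in parsecs from visible luminosity (solar units) and
    apparent magnitude, by scaling from the Sun's own values.

    Brightness falls off as 1/d^2, so
    m_star - m_sun = -2.5 * log10( L * (d_sun / d_star)^2 ),
    which solved for d_star gives the expression below.
    """
    return SUN_DISTANCE_PC * math.sqrt(luminosity_solar) * 10 ** ((apparent_mag - SUN_APPARENT_MAG) / 5.0)

# Example: a star as luminous as the Sun with apparent magnitude 5
# comes out at about 10.8 parsecs, consistent with the Sun's absolute
# visual magnitude of roughly 4.8.
print(distance_from_two(1.0, 5.0))
```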

Edited by Martin
