When looking at the night sky, 1 is bigger than 6; here’s why

Greetings, stargazers. This is the time of year when the brightest part of the Milky Way is rising in the east. Although the Milky Way indeed contains billions of stars, how many of those stars can you actually see with your naked eye when you go outside? Certainly more in the Four Corners than in any big city, but likely not as many as you might guess.

In a dark part of the county, you could probably see around 4,000 stars, if you could count them all. Of course, this depends on how dark it is and how good your eyes are, but it is a good ballpark figure for most people. The number of stars you can see depends on what is called the limiting magnitude for your location.
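If you are curious where a ballpark figure like that comes from, here is a rough Python sketch. The whole-sky star counts in it are my own approximate values from published bright-star catalogs, not numbers from this column, and it assumes only about half the sky is above the horizon at any moment.

# Rough estimate of naked-eye star count from limiting magnitude.
# The cumulative whole-sky counts below are approximate catalog
# values (ballpark figures, not exact measurements).
approx_stars_brighter_than = {
    4.0: 500,     # roughly a bright suburban sky
    5.0: 1600,
    6.0: 5000,
    6.5: 9000,    # roughly a good dark site
}

limiting_mag = 6.5
visible_at_once = approx_stars_brighter_than[limiting_mag] // 2
print(visible_at_once)  # 4500 -- close to the 4,000 figure above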

The term “limiting magnitude” refers to the magnitude of the dimmest object you can see from a given location. In a future column, I will explain how you can measure your limiting magnitude, but this month I will be writing about the star magnitude scale in general.

The historical magnitude scale dates back at least to the time of Hipparchus and Ptolemy, over 2,000 years ago. That historical system labeled the brightest stars in the sky as “of the first magnitude,” or as I like to think of them, “first-class stars.” Those that weren’t quite as bright got to be “second magnitude,” and so on, down to the dimmest ones that could be seen, which were labeled “sixth magnitude.” Categorizing the stars like this was very subjective.

Today’s scale keeps many of the quirks of the ancient scale, including one that many newcomers to astronomy find difficult to grasp. This is the continued use of an “inverted” scale – smaller numbers mean brighter stars. The modern scale expands the original range of magnitudes to numbers higher than six for dimmer stars, and lower, even negative numbers, for brighter stars. And all values, not just integers, are allowed, so very small differences can be distinguished.

The second quirk reflects the fact that human vision detects light logarithmically. This is why you can see differences among very bright things, and also among things that might be thousands of times dimmer. The modern scale is built around this: a difference of five magnitudes is defined to be exactly a factor of 100 in brightness.

With this definition, each magnitude differs from the next by a factor of about 2.512 in brightness. (For the math geeks reading this column, 2.512 is the fifth root of 100, that is, 100^0.2.) Stars two magnitudes apart differ by about 2.5 (for the first magnitude) times 2.5 (for the second magnitude), or about 6.3 times in brightness.
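For anyone who wants to play with these numbers, here is a minimal Python sketch; the function name is my own invention, but the math is exactly the definition above.

# Convert a magnitude difference into a brightness ratio, using the
# definition that five magnitudes equals exactly a factor of 100.
def brightness_ratio(delta_mag):
    return 100 ** (delta_mag / 5)

print(brightness_ratio(1))  # ~2.512, the fifth root of 100
print(brightness_ratio(2))  # ~6.31, the "about 6.3 times" example
print(brightness_ratio(5))  # exactly 100.0, by definition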

On the modern magnitude scale the full moon is apparent magnitude -12.9, and the sun is -26.7. The dimmest galaxy visible through the biggest telescope is greater than magnitude +30, which is about 10^22 times dimmer than the sun. That is a 1 followed by 22 zeroes, which is quite the range of brightnesses we can detect.
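You can check that range with the same rule; the short sketch below reuses the brightness_ratio idea from above. (Strictly, the number comes out a few times larger than a plain 1 followed by 22 zeroes, but it is the same order of magnitude.)

# Magnitude gap between the sun (-26.7) and a +30 galaxy.
delta_mag = 30 - (-26.7)           # 56.7 magnitudes
ratio = 100 ** (delta_mag / 5)
print(f"{ratio:.1e}")              # about 4.8e+22, on the order of 10^22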

This month

Right after dusk, bright Arcturus is high overhead, and equally bright Vega is rising in the northeast. These are the second- and third-brightest stars visible from Durango. (Sirius, the brightest, is out during the day right now.) For many years, Vega was used as the “standard” reference star for the magnitude scale, although modern instruments can detect that it varies slightly from its defined magnitude of 0.0. The other really bright objects visible, besides the moon, are Mars and Saturn.

To see lots of stars at the dimmer end of visibility, look toward the Milky Way. With binoculars it is easy to see that those faint, glowing regions really are just lots of dim stars. Take a look and you will be able to far surpass the 4,000-star limit I mentioned above.

hakes_c@fortlewis.edu

Charles Hakes is an assistant professor in the physics and engineering department at Fort Lewis College and is director of the Fort Lewis Observatory.