Lunchtime Photo

Today is a do-over. Allow me to explain.

Nebulas in the night sky are basically just clouds of gas made luminous by starlight. Their glow comes mostly from three emission lines: hydrogen alpha (red), sulfur (a slightly different shade of red), and oxygen (blue-green). Serious astrophotographers capture this by using monochrome cameras and a set of narrowband filters that allow only the light from those three lines through. They take one batch of pictures with each filter and then combine the batches to get a color image.

You can combine the colors in various ways, called palettes. The best known is the Hubble palette, which maps sulfur to red, hydrogen to green, and oxygen to blue; it's the one used by the Hubble telescope, and it gives its pictures their distinctive look.
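For the curious, a palette is nothing more than a fixed assignment of emission lines to display channels. Here's the Hubble mapping spelled out (the wavelengths are the standard line centers):

```python
# The Hubble (SHO) palette assigns each narrowband exposure to one
# display channel. Note the false color: hydrogen alpha is actually
# deep red, but it gets displayed as green.
HUBBLE_PALETTE = {
    "red":   "SII  (672 nm, singly ionized sulfur)",
    "green": "Ha   (656 nm, hydrogen alpha)",
    "blue":  "OIII (501 nm, doubly ionized oxygen)",
}
```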

I can't do this because I have a color camera and those filters are expensive. However, I do have a single dual-band filter that allows only the hydrogen alpha and oxygen through. This helps the colors pop and reduces light pollution.

But today I discovered by accident that my software can mimic the palette process. You take your set of images captured with the dual-band filter and run it through the software three times, each pass extracting a different color. Then you recombine them using the palette of your choice.

I have no idea how this works. Magic? Algorithms beyond human ken? In any case, the software pulls the images apart and then puts them back together. It's not as good as doing it the real way, but it's better than nothing.
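For the technically inclined, here's one guess at what's going on under the hood. This is a minimal numpy sketch, not my software's actual algorithm: with a dual-band filter, nearly all of the hydrogen alpha light lands on the camera's red pixels, while the oxygen light lands on the green and blue pixels, so the two can be pulled apart and reassigned to whatever palette you like.

```python
import numpy as np

def extract_lines(rgb):
    """Pull two narrowband channels out of a dual-band one-shot-color
    image. Ha (656 nm) falls almost entirely on the red pixels, while
    OIII (501 nm) straddles the green and blue pixels."""
    ha = rgb[..., 0]                          # red channel ~ Ha
    oiii = 0.5 * (rgb[..., 1] + rgb[..., 2])  # green/blue average ~ OIII
    return ha, oiii

def hoo_palette(ha, oiii):
    """Recombine as an HOO image: Ha drives red, OIII drives both green
    and blue. (There's no sulfur data, so a true Hubble palette would
    need a synthetic third channel.)"""
    return np.dstack([ha, oiii, oiii])
```

Each extraction is a separate pass over the same stacked data, which may be why the software has you run it three times, once per output channel.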

So I did this to yesterday's photo. Remember I mentioned that my image failed to produce the blue color of the bubble at the center of the Bubble Nebula? With a bunch of trial and error plus some photoshopping, I managed to bring it out. I don't think it's very good, but I imagine it could become pretty good once I learn more about this.

The top photo is the recombined image. The bottom photo is the original from yesterday. You'll notice that the top image not only captures the blue bubble, it also captures more definition throughout the rest of the nebula. This seems sort of magical since it's using the exact same data as the other picture. So where does the additional detail come from?

August 31, 2024 — Desert Center, California

9 thoughts on “Lunchtime Photo”

  1. Solarpup

    Your CCD has color information. If I had to guess what's going on, the software is applying three narrow(ish) color filters to the info coming from your CCD, and then renormalizing the image values before combining with your chosen color palette. That's really no different from what you'd be doing with the expensive filters, except that I imagine the real filters can be made with narrower bands, and you could more easily adjust the exposure time for each filter so that no one image is noisier than another.
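    Something like this, if I had to sketch the renormalization step (purely illustrative; I have no idea what the software actually does):

    ```python
    import numpy as np

    def renormalize(channel):
        """Illustrative renormalization: subtract the sky background
        (approximated by the per-channel median) and rescale so every
        channel spans [0, 1]. Without a step like this, whichever
        channel collected the most light would dominate the combine."""
        bg = np.median(channel)
        x = channel - bg
        peak = np.percentile(x, 99.9)  # robust "max" that ignores hot pixels
        return np.clip(x / peak, 0.0, 1.0)
    ```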

    Does the software allow you to tell it that you've used a filter?

    1. Kevin Drum

      Yes, that's my guess too. And the software has separate modes depending on whether you used a filter, so it does know.

      1. Solarpup

        We do it in the X-ray, but the energy resolution on CCDs isn't exactly the best: E/ΔE ~ 50 at 6.4 keV (a prominent fluorescence line), and it gets worse as ~E^(1/2) as you go to lower energies. But we do what we can and make three-color images around prominent emission lines, then just throw out the in-between energies. It works pretty well for things like supernova remnants.
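        To put rough numbers on that (a back-of-the-envelope, taking the E/ΔE ∝ √E scaling at face value):

        ```python
        # E/dE ~ 50 at 6.4 keV, degrading as ~sqrt(E) toward lower energies.
        def resolving_power(E_keV, R0=50.0, E0_keV=6.4):
            return R0 * (E_keV / E0_keV) ** 0.5

        for E in (6.4, 2.0, 0.5):
            R = resolving_power(E)
            print(f"E = {E:3.1f} keV:  E/dE ~ {R:4.1f}  (dE ~ {1000 * E / R:.0f} eV)")
        ```

        So by 0.5 keV you're down to E/ΔE ~ 14 -- enough to separate widely spaced lines, but nothing like a grating.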

        Calorimeters with pretty amazing (by X-ray standards) spectral resolution of ~1200 are coming online, but the spatial resolution isn't there yet.

        Folks in the IR, with Integral Field Units, get the best of both worlds these days.

        It's kind of cool that the software can account for your filter. It'd be interesting to see if it does better with or without the filter -- a competition between filtering out unwanted light vs. reducing your overall intensity.

    2. Greg_in_FL

      Yes, Kevin's CCD gives three numbers for each pixel (call them R, G, and B, for simplicity). But the post-processing software Kevin tried out knows what the bandpass characteristics of his filter are, so it can modify the R, G, and B numbers to back out what the unfiltered values (call them R*, G*, and B*) would be. Then it applies the palette, using another mapping (e.g. Hubble) of the values R*, G*, and B* to displayed colors.
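      In spirit it's a small linear-algebra problem: each emission line deposits a known fraction of its light in each of the camera's channels, and the software inverts that. A toy sketch (the matrix entries here are invented; real software would derive them from the camera's quantum-efficiency curves and the filter's bandpass):

      ```python
      import numpy as np

      # Hypothetical sensitivity matrix: the fraction of each line's light
      # landing in each Bayer channel (rows: R, G, B; columns: Ha, OIII).
      M = np.array([
          [0.95, 0.02],   # red pixels:   almost all Ha
          [0.05, 0.55],   # green pixels: mostly OIII
          [0.00, 0.43],   # blue pixels:  the rest of the OIII
      ])

      def unmix(rgb):
          """Least-squares estimate of the per-pixel Ha and OIII
          intensities from the observed R, G, B values."""
          h, w, _ = rgb.shape
          flat = rgb.reshape(-1, 3).T                    # shape (3, h*w)
          lines, *_ = np.linalg.lstsq(M, flat, rcond=None)
          return lines.T.reshape(h, w, 2)                # [...,0]=Ha, [...,1]=OIII
      ```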

      The monochrome camera combined with narrowband filters allows the astrophotographer to home in on the actual (narrow line) emission from the nebula, and so does a better job at rejecting light pollution. But it's quite a bit more work, both in the field and in post processing. It takes three times the observation time, and requires a filter wheel to swap filters.

      One ironic and clever side effect of the monochrome/narrowband approach is that it allows the observer to use a cheap scope. You don't need a $5,000 apochromatic scope if you're looking at monochromatic images; you just have to refocus carefully between exposures with different filters. A decent $500 doublet scope will do. But still, once you go down the rabbit hole of astrophotography, you're going to invest a ton of money and especially time anyway.

    1. Kevin Drum

      Yes, it's definitely oxygen (doubly ionized oxygen, OIII, technically). As for why it's bubble shaped, I don't know. Just an expanding cloud, I suppose.

  2. Winslow2

    I thought the original image was amazing, but the recombined version is spectacular. I could look at that all day. If you ever decide to sell prints of your photos....
