Pillars of Creation: Using the Hubble Palette
I finally got to capture two nights of data from my observatory in New Mexico. I chose M16, the Eagle Nebula, visible above the southern horizon near Scutum in the Milky Way. I set up my new full-frame camera (QHY600 Mono) along with a set of new narrow-band filters (Chroma). I decided to shoot RGB (a separate capture with each color filter) and, in addition, about two hours of narrow-band emission data (H-alpha, OIII, and SII) for each wavelength.
What I had in mind was to create an image using the “SHO Hubble Palette” technique to reveal the structure of the “Pillars of Creation” inside M16.
The Hubble Palette, named for an image-processing technique used by the Hubble Space Telescope team, creates what is called "false color" imaging: data captured through each narrow-band filter is assigned to one of the red, green, or blue channels of an RGB image. SHO refers to the first letters of the narrow-band filters in channel order: SII to red, Ha to green, OIII to blue. The technique and its false colors reveal the Pillars' overall structure very well, even with an imaging system of only 850mm focal length (at about f/6.3).
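For readers who want to experiment with the idea outside PixInsight, here is a minimal sketch of the classic SHO assignment. It assumes three registered, stretched mono frames saved as FITS; the file names are placeholders, not my actual data.

```python
import numpy as np
from astropy.io import fits

# Classic SHO assignment: each narrow-band channel maps to one RGB channel.
# The FITS file names below are placeholders for your own frames.
sii  = fits.getdata("M16_SII.fits").astype(np.float32)   # SII  -> red
ha   = fits.getdata("M16_Ha.fits").astype(np.float32)    # Ha   -> green
oiii = fits.getdata("M16_OIII.fits").astype(np.float32)  # OIII -> blue

rgb = np.dstack([sii, ha, oiii])          # stack into an RGB cube (H x W x 3)
rgb = np.clip(rgb / rgb.max(), 0.0, 1.0)  # normalize to [0, 1] for display
```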
The first night, I focused on getting as much H-alpha data as possible, because this was going to be the main source of luminosity data for my composition. I decided not to capture any "traditional" luminosity data (i.e., through a clear UV/IR-cut filter), but instead to use the narrow-band H-alpha, which gives tighter stars, and to build the SHO Hubble Palette on top of that Ha data. In lay terms, my intention was a Ha-SHO image with RGB stars: Ha for luminosity, SHO for the Hubble Palette (see the channel mix below), and RGB just for the stars, to show their real color.
On the second night, I captured RGB for the star colors to be re-injected into the narrow-band composition. With moonrise around 1:00am local time, I did not have much time to get that RGB broad-band data. On top of that, tracking error and USB 3.0 cable woes limited the data that could be used in processing. (Tip: connect the camera directly to the computer, not through the mount!) Then I got the last two narrow-bands needed: OIII and SII. But because my observatory is not yet motorized, I could image for only about one hour and 30 minutes at a time! I had to wake up and move the dome by hand to reset it for the next hour and 30 minutes. (Can't wait to have it automated!)
Just before leaving for the airport, I scrambled to capture my flats for that last night. Flats are used to eliminate the vignetting and the dust spots clearly visible on my optics. There was no time to take dark frames for my narrow-band…so in APP (Astro Pixel Processor) I simply disabled the calibration warnings and forced it to use my lower-gain (ISO) darks for those narrow-band frames. Why use a different gain for narrow-band (NB) vs. broad-band (BB)? It is highly recommended to increase the camera gain for NB, as its signal is much weaker (less light hitting the sensor). For my system, I double the gain between BB and NB; check your histogram, whatever system you are using, to make that determination. All my frames here are 5 minutes long.
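For those curious what the calibration itself actually does, here is a rough, simplified sketch of the standard light-frame calibration that software like APP performs (one master dark and one master flat, hypothetical file names, no bias or flat-dark handling shown):

```python
import numpy as np
from astropy.io import fits

# Hypothetical master calibration frames (built elsewhere, e.g. by APP).
light       = fits.getdata("light_Ha_300s.fits").astype(np.float32)
master_dark = fits.getdata("master_dark_300s.fits").astype(np.float32)
master_flat = fits.getdata("master_flat_Ha.fits").astype(np.float32)

# Subtract the thermal/offset signal, then divide by the normalized flat
# to remove vignetting and dust spots.
flat_norm  = master_flat / np.mean(master_flat)
calibrated = (light - master_dark) / flat_norm
```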
Once all the data was on my processing laptop, I immediately cataloged it into my standard folder structure and noted what I had shot and any possible issues with the data. I fired up APP, and in about an hour and 30 minutes it rendered my 50 flat frames, 48 NB light frames, and 19 BB color light frames, all at 60MP (9576 x 6388 pixels) resolution, totaling 14GB of RAW data over two nights (!).
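My cataloging step is nothing fancy; a short script along these lines (the folder names and FITS header keywords are assumptions, adapt them to your own capture software) sorts the frames by frame type and filter:

```python
import shutil
from pathlib import Path
from astropy.io import fits

src = Path("incoming")   # where the night's FITS files land (hypothetical)
dst = Path("M16_2020")   # per-target folder (hypothetical)

for f in sorted(src.glob("*.fits")):
    hdr   = fits.getheader(f)
    frame = str(hdr.get("IMAGETYP", "LIGHT")).strip().upper()  # LIGHT / FLAT / DARK
    filt  = str(hdr.get("FILTER", "UNKNOWN")).strip()          # Ha, OIII, SII, R, G, B
    out   = dst / frame / filt
    out.mkdir(parents=True, exist_ok=True)
    shutil.copy2(f, out / f.name)
```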
After APP calibrated, aligned, and pre-processed each filter separately, I could import everything into PixInsight and start the meat of my work. A great alternative to tedious masking in PixInsight is the StarNet++ AI-based program, which removes and/or selects the stars from each of your channels (a.k.a. each "band"). After struggling with an incompatibility between the latest macOS Catalina and the latest PixInsight, I found a solution and posted it on the PixInsight forum for all to use: I basically disabled the security checks inside the macOS PixInsight executable so that it could recognize the StarNet module.
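Once StarNet has produced a starless frame, getting a matching star-only image (handy later as a mask) is just a subtraction. A small sketch, assuming both images are already stretched and normalized; the file names and the threshold are illustrative only:

```python
import numpy as np
from astropy.io import fits

# Original and starless versions of the same stretched channel
# (hypothetical file names), both normalized to [0, 1].
original = fits.getdata("Ha_stretched.fits").astype(np.float32)
starless = fits.getdata("Ha_stretched_starless.fits").astype(np.float32)

stars = np.clip(original - starless, 0.0, 1.0)   # star-only residual
mask  = (stars > 0.05).astype(np.float32)        # crude binary star mask; tune the threshold
```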
Anyway, I was able to create a starless image for each of my Ha, OIII, and SII narrow-bands. This allowed me to create the SHO Hubble mix without any stars (also called a modified Hubble Palette), as follows (a short code sketch of this mix appears after the list):
- Red channel: 80% SII + 20% Ha
- Green channel: 100% Ha
- Blue channel: 100% OIII
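As a concrete illustration, the mix boils down to three weighted sums. This is a sketch in NumPy, not the exact PixInsight step I used, and the starless file names are hypothetical:

```python
import numpy as np
from astropy.io import fits

def load(path):
    """Load a starless mono frame and normalize it to [0, 1]."""
    data = fits.getdata(path).astype(np.float32)
    return data / data.max()

# Hypothetical starless channel files produced by StarNet.
sii  = load("M16_SII_starless.fits")
ha   = load("M16_Ha_starless.fits")
oiii = load("M16_OIII_starless.fits")

red   = 0.8 * sii + 0.2 * ha   # 80% SII + 20% Ha
green = ha                     # 100% Ha
blue  = oiii                   # 100% OIII
sho   = np.clip(np.dstack([red, green, blue]), 0.0, 1.0)
```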
Note that I processed my Ha with deconvolution (removing most of the atmospheric blur) and noise reduction. That allowed me to reuse it for an LRGB combination: Ha as the luminosity layer and the SHO mix described above as the RGB. It started to look really nice, but the stars did not look good (a weird purple). I then applied star masks again (thanks, StarNet!) and simply copied the broad-band RGB stars (color-calibrated beforehand) onto my narrow-band composite, and that was it!
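The star re-injection itself can be done several ways. My actual step was done in PixInsight with masks; here is a minimal sketch of one common alternative, a screen blend, with hypothetical input files:

```python
import numpy as np
from astropy.io import fits

# Hypothetical inputs: the starless SHO composite and a star-only RGB image
# (broad-band, color-calibrated stars), registered and scaled to [0, 1].
sho_starless = fits.getdata("M16_SHO_starless_rgb.fits").astype(np.float32)
rgb_stars    = fits.getdata("M16_RGB_stars_only.fits").astype(np.float32)

# Screen blend: adds the stars back without clipping the nebula underneath.
final = np.clip(1.0 - (1.0 - sho_starless) * (1.0 - rgb_stars), 0.0, 1.0)
```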
I finished with some local contrast enhancement and color saturation adjustments using the CurvesTransformation process, added a touch of the DarkStructureEnhance script, and exported to Photoshop for some minimal Camera Raw adjustments.
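If you want to approximate those finishing touches outside PixInsight, a simple midtone curve and a saturation boost look roughly like this. The coefficients are arbitrary examples, not the values I used:

```python
import numpy as np

def midtone_curve(img, m=0.4):
    """Midtones transfer function: m < 0.5 brightens, m > 0.5 darkens."""
    return ((m - 1.0) * img) / ((2.0 * m - 1.0) * img - m)

def boost_saturation(rgb, factor=1.2):
    """Push each pixel away from its own luminance to raise saturation."""
    lum = rgb.mean(axis=-1, keepdims=True)
    return np.clip(lum + factor * (rgb - lum), 0.0, 1.0)

# Example use on a [0, 1] RGB array:  enhanced = boost_saturation(midtone_curve(rgb))
```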
Comparing my poor RAW data (especially the RGB integration, with loads of noise and bloated stars) with the final image, proven post-processing techniques saved the day. Next time, I will try to get more reliable, higher-quality RAW data and hopefully spend less time post-processing! I know, it's wishful thinking….
I hope this lengthy post is helpful to some of you and inspires others!
Capture Details
- Narrow-band light frames: 4 hours (5 minutes per frame)
- Broad-band (RGB) frames: 1 hour 35 minutes (5 minutes per frame)
- Scope: Astro-Physics 130mm f/6.3 refractor with flattener (850mm focal length)
- Camera: QHY600 Mono with Chroma filters
- Mount: CEM 120
- Sleepless Astro-nut on duty
- Repository with all sub-images discussed in the post above
Antoine is a member of the Astrophotography group. You can see more photos on Instagram @eyeontheuniverse and on his website, Eye on the Universe.