Cross-posted to: pics
I misread the title as “The Little Dumbass Nebula” and thought to myself “that’s pretty harsh, I’m sure they’re trying their best.”
He’s just like me fr
The Little Dumbbell Nebula gets its name because it kinda looks like a tinier version of the Dumbbell Nebula M27 (yes, a different palette was used for this pic). It’s really tiny compared to the uncropped FOV. I’m a lot happier with this attempt at it compared to my 2019 pic of M76 with the same equipment. I know it’s a bit out of season rn, but I needed something to shoot at the start of the night. The nebulosity itself is false color, but the stars are true color RGB. Captured over 10 nights in Feb/Mar 2024 from a Bortle 9 zone (I could only get a couple hours max per night on it).
Places where I host my other images:
Equipment:
- TPO 6" F/4 Imaging Newtonian
- Orion Sirius EQ-G
- ZWO ASI1600MM-Pro
- Skywatcher Quattro Coma Corrector
- ZWO EFW 8x1.25"/31mm
- Astronomik LRGB+CLS Filters - 31mm
- Astrodon 31mm Ha 5nm, Oiii 3nm, Sii 5nm
- Agena 50mm Deluxe Straight-Through Guide Scope
- ZWO ASI-290mc for guiding
- Moonlite Autofocuser
Acquisition: 21 hours 6 minutes (Camera at -15°C), NB exposures at unity gain and BB at half unity
- Ha - 99x360"
- Oiii - 83x360"
- R - 101x60"
- G - 100x60"
- B - 99x60"
- Darks - 30
- Flats - 30 per filter
Capture Software:
- Captured using N.I.N.A. and PHD2 for guiding and dithering.
PixInsight Preprocessing:
- BatchPreProcessing
- StarAlignment
- ImageIntegration per channel
- DrizzleIntegration (2x, Var β=1.5)
- DynamicCrop
- DynamicBackgroundExtraction
For DBE, I duplicated each image and removed the stars via StarXTerminator, then ran DBE with a shitload of points to generate a background model. The model was then removed from the original pic using the following PixelMath (math courtesy of /u/jimmythechicken1):
$T * med(model) / model
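If that PixelMath looks opaque: $T is the image being processed and model is the DBE background. Dividing by the model flattens the gradient, and scaling by med(model) keeps the overall brightness roughly where it started. A rough numpy sketch of the same idea (just an illustration with a made-up function name, not how PixInsight actually implements it):

```python
import numpy as np

def flatten_background(image: np.ndarray, model: np.ndarray) -> np.ndarray:
    """Numpy version of the PixelMath `$T * med(model) / model`.

    Dividing by the background model removes the gradient; multiplying by the
    model's median keeps the result near its original brightness level.
    """
    return image * np.median(model) / model
```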
Narrowband Linear:
- BlurXTerminator and NoiseXTerminator
- StarXTerminator to completely remove the stars (to be replaced later by the RGB ones)
- ArcsinhStretch to slightly stretch nonlinear (rough sketch of the idea at the end of this section)
- iHDR 2.0 script to stretch each channel the rest of the way.
This is a great new PixInsight script from Sketch on the Discord. Here’s the link to the repo if you want to add it to your own PI install.
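If you’re wondering what an arcsinh stretch actually does, the basic idea (this is the generic asinh stretch; PixInsight’s ArcsinhStretch adds black-point and color-protection handling on top of it) is to lift the faint signal a lot while compressing the bright cores, with one curve applied to every channel so colors hold up better than with a plain histogram stretch. A minimal numpy sketch, assuming a linear image normalized to [0, 1]:

```python
import numpy as np

def arcsinh_stretch(x: np.ndarray, stretch: float = 100.0) -> np.ndarray:
    """Generic asinh stretch for a linear image normalized to [0, 1].

    asinh(s*x) is roughly linear for faint pixels and logarithmic for bright
    ones, so faint nebulosity comes up without blowing out the bright cores.
    Dividing by asinh(s) keeps the output in [0, 1].
    """
    return np.arcsinh(stretch * x) / np.arcsinh(stretch)
```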
RGB Linear:
- ChannelCombination to combine the monochrome R, G, and B frames into a color image
- SpectroPhotometricColorCalibration
- BlurXTerminator for star sharpening
- HSV Repair
- StarXTerminator to generate a stars-only image
- ArcsinhStretch + HT to stretch nonlinear (to be combined with the starless narrowband image later)
Nonlinear:
- PixelMath to combine the stretched Ha and Oiii images into a color image (/u/dreamsplease’s palette; a rough numpy version of this mapping is at the end of this section):
R = iif(Ha > .15, Ha, (Ha*.8)+(Oiii*.2))
G = iif(Ha > 0.5, 1-(1-Oiii)*(1-(Ha-0.5)), Oiii *(Ha+0.5))
B = iif(Oiii > .1, Oiii, (Ha*.3)+(Oiii*.2))
- NoiseXTerminator again
- Shitloads of CurvesTransformations to adjust lightness, hues, contrast, saturation, etc.
- LocalHistogramEqualization
- UnsharpMask
- More curves
- ColorSaturation to slightly desaturate the purples
- Even more curves
- PixelMath to add in the stretched RGB stars-only image from earlier
This basically re-linearizes the two images, adds them together, and then stretches the result back to where it was (again, credit to Jimmy’s independent starless processing stuff; a numpy version is at the end of this section):
mtf(.005,
mtf(.995,Stars)+
mtf(.995,Starless))
- Couple final curves
- DynamicCrop waaaay in on the nebula
- Annotation
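For anyone who wants to poke at the palette outside of PixInsight, here’s a rough numpy version of the iif() logic above (assumes Ha and Oiii are the stretched, starless, [0, 1]-normalized channels; np.where plays the role of iif, and the function name is just for illustration):

```python
import numpy as np

def hoo_palette(ha: np.ndarray, oiii: np.ndarray) -> np.ndarray:
    """Map stretched, starless Ha and Oiii channels to an RGB image."""
    r = np.where(ha > 0.15, ha, 0.8 * ha + 0.2 * oiii)
    g = np.where(ha > 0.5, 1 - (1 - oiii) * (1 - (ha - 0.5)), oiii * (ha + 0.5))
    b = np.where(oiii > 0.1, oiii, 0.3 * ha + 0.2 * oiii)
    return np.stack([r, g, b], axis=-1)
```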
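And a numpy version of the stars + starless recombination. mtf() here is the standard PixInsight midtones transfer function; mtf(0.995, x) pushes each stretched image back toward linear, the two are added, and mtf(0.005, x) stretches the sum back up (again, a sketch, not the exact internals):

```python
import numpy as np

def mtf(m: float, x: np.ndarray) -> np.ndarray:
    """PixInsight midtones transfer function: maps m to 0.5, keeps 0 and 1 fixed."""
    return (m - 1.0) * x / ((2.0 * m - 1.0) * x - m)

def add_stars(stars: np.ndarray, starless: np.ndarray) -> np.ndarray:
    """mtf(.005, mtf(.995, Stars) + mtf(.995, Starless)), clipped to [0, 1]."""
    return np.clip(mtf(0.005, mtf(0.995, stars) + mtf(0.995, starless)), 0.0, 1.0)
```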
As a beginner, seeing this information really helps. Thanks for including it all and you’ve made a fantastic picture.
I have a couple of questions if you don’t mind: what do NB and BB mean? And regarding the number of calibration frames, do those numbers represent your choice at imaging time, or what remained after excluding any that you or the software didn’t like? Thanks!
NB = narrowband (the Ha and Oiii filters) and BB = broadband (the RGB filters). Because I’ve got a lot of light pollution I use a lower gain for the broadband filters since they let a lot more light through.
I only count exposures post-rejection. The morning after imaging I’ll look through the pics and delete any that were shot through clouds, out of focus, not centered, or have trailing stars. The rest go on to get calibrated and stacked for the final image.
Thanks!
Awesome photography! But I have a question…
Regarding space-time adjustments to what you have there, and the fact that we’re already seeing it…
Doesn’t that mean the explosion radius is way larger than it appears?
I get your point, and yes, it’s already moved from where it was when it was imaged. What you could see would depend on how far away you are from it. If you could now, suddenly, magically be at the edge where it was when the photo was taken, it would be somewhere else. Everywhere we look, we’re seeing into the past.
You missed my point altogether. It doesn’t matter if the source of the event moved at all or not, I’m talking about the shockwave through the universe.
Once we can actually see the radius of a supernova explosion, well that radius is long ago in the past, and whatever dangerous gamma rays or neutrinos or whatever have probably long ago passed us.
Time dilation yo.
I got your point; I think you just argued my point. Nothing we image is as it “is” but rather as it “was”.