Blotchy images after noise reduction [Deep Sky] Processing techniques · Derek Vasselin

chroniclesofthecosmos 1.51
I've noticed lately that I struggle with blotchy images during processing, specifically after using NoiseXTerminator. You can see what I'm referring to in the top-left corner of the image below.

I am using a ZWO ASI2600MC, with an HaOiii dual-band filter and an SiiOiii dual-band filter. After stacking each group separately, I'm extracting the RGB channels and recombining to create an SHO image.
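
For what it's worth, here is a minimal numpy sketch of that channel bookkeeping, assuming the usual dual-band/OSC mapping (red ≈ the emission line that falls in red, i.e. Ha or SII; green + blue ≈ OIII). The stand-in arrays and the plain averages are illustrative only, not the exact extraction/PixelMath used:

```python
import numpy as np

rng = np.random.default_rng(0)

# Stand-ins for the two debayered, stacked OSC masters (H x W x RGB).
haoiii  = rng.random((100, 100, 3)).astype(np.float32)   # Ha/OIII dual-band stack
siioiii = rng.random((100, 100, 3)).astype(np.float32)   # SII/OIII dual-band stack

# Usual dual-band assumption: the red channel carries the red emission line
# (Ha or SII), while OIII straddles the green and blue Bayer channels.
ha     = haoiii[..., 0]
oiii_a = 0.5 * (haoiii[..., 1] + haoiii[..., 2])
sii    = siioiii[..., 0]
oiii_b = 0.5 * (siioiii[..., 1] + siioiii[..., 2])

# Combine the two OIII estimates (plain average here; a noise-weighted mean
# would suit stacks with different total exposure better).
oiii = 0.5 * (oiii_a + oiii_b)

# SHO (Hubble) palette: SII -> R, Ha -> G, OIII -> B.
sho = np.stack([sii, ha, oiii], axis=-1)
print(sho.shape)   # (100, 100, 3)
```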

The image below is auto-stretched for editing and is 32 bit.

Process so far: crop > PCC > DBE > BXT > NXT.

The HaOiii image is about 11 hrs and the SiiOiii is about 18 hrs.

I thought in prior images the issue was a lack of integration time, but I've got quite a bit here. 

The only other thing I can think of is that mixing two dual-band filters with a color camera causes weird issues with noise. But a monochrome camera is currently out of the budget.

Anyone know any workarounds for this? Is the answer still just 'more' integration time?

image.png
SkyHoinar 0.90
Does this also persist after stretching the image?
ONikkinen 4.79
Make sure you have the higher-precision STF view on. You can get this kind of effect with an autostretched STF applied without it, when in reality there is no issue with the data.

If it is a data issue then I'm not sure what the culprit is, but try the above first.

Also, 0.9 NXT is way too much; dial it back a bit. You are not trying to make the noise go away completely and turn the image into a painting, which is what happens at 0.9.
daniele.borsari 5.25
I think this is posterization. See PixInsight's own article about this. I recommend using 24-bit LUTs as the STF method:
Screenshot 2024-10-16 214203.png
There is also a setting in the Global Preferences under "Miscellaneous Image Window Settings" to use "24-bit Screen Transfer Function LUTs by default".
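
As a small numpy illustration of why the higher-precision LUT matters (the stretch parameters and value range below are made up, and the midtones function only mimics the general behaviour of a screen autostretch, not PixInsight's exact implementation): a faint, perfectly smooth patch stretched hard ends up with only a handful of distinct display levels through an 8-bit LUT, but remains effectively continuous through a 24-bit one.

```python
import numpy as np

def mtf(x, m):
    # Standard midtones transfer function (the kind of curve a screen
    # autostretch applies); m is the midtones balance.
    return ((m - 1.0) * x) / ((2.0 * m - 1.0) * x - m)

# A faint, perfectly smooth patch: 32-bit values spanning a tiny range,
# like heavily noise-reduced background nebulosity.
x = np.linspace(0.0010, 0.0012, 1_000_000)

stretched = mtf(x, 0.001)          # aggressive screen stretch of that faint range

# 8-bit display LUT: only a dozen or so distinct output levels survive,
# which shows up as banding/blotches on smooth areas.
lut8 = np.round(stretched * 255) / 255
print("distinct 8-bit display levels:", np.unique(lut8).size)

# 24-bit LUT: effectively continuous for display purposes.
lut24 = np.round(stretched * (2**24 - 1)) / (2**24 - 1)
print("distinct 24-bit display levels:", np.unique(lut24).size)
```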

Daniele
andreatax 9.89
I have never used 24-bit LUTs and have never had an issue with posterization, so I shouldn't think that is the issue. The issue is bad data processing, specifically way too much NR (and maybe other things besides). And you are wasting time applying NXT to a linear image.
daniele.borsari 5.25
andrea tasselli:
I have never used 24-bit LUTs and have never had an issue with posterization, so I shouldn't think that is the issue. The issue is bad data processing, specifically way too much NR (and maybe other things besides). And you are wasting time applying NXT to a linear image.

Well, in some cases where noise reduction was used heavily and the data was very smooth, I could see a little bit of posterization.

Daniele
daveshow07 3.15
I feel like I sometimes experience this when applying noise reduction to linear data, before stretching. Was noise reduction applied before or after stretching?
jrista 11.18
Derek Vasselin:
I've noticed lately that I struggle with blotchy images during processing, specifically after using NoiseXTerminator. …

Your denoise is 90... My guess is it's just too much NR and you are over-smoothing the data. Noise reduction is a matter of finessing your signal, not obliterating information. Most people seem to take the obliteration path, and there ARE CONSEQUENCES to doing so.

Pull back on the amount of NR, leave a light bit of noise in the signal, and you shouldn't have any problems.
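
To put rough numbers on "pull back": if you think of the denoise amount as a simple blend between the original pixels and a fully smoothed estimate (a simplification for intuition only, not NoiseXTerminator's documented algorithm), the leftover grain scales with how far you back off:

```python
import numpy as np

rng = np.random.default_rng(3)

signal = 0.25
noisy  = signal + rng.normal(0.0, 0.02, 100_000)   # flat patch with grain of std 0.02
smooth = np.full_like(noisy, signal)               # idealised fully-denoised estimate

# Hypothetical model: output = (1 - amount) * original + amount * denoised.
for amount in (0.9, 0.7, 0.5):
    blended = (1.0 - amount) * noisy + amount * smooth
    print(f"amount {amount:.1f}: residual grain std {blended.std():.4f}")
# At 0.9 almost nothing is left (the painted look); at 0.5-0.7 a light grain survives.
```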
DalePenkala 19.38
Jon Rista:
My guess is it's just too much NR and you are over-smoothing the data. … Pull back on the amount of NR, leave a light bit of noise in the signal, and you shouldn't have any problems.

I will agree with this. I always pull it back to around 65-70 and then work with it after my stretch.

Dale
Alexn 12.25
As has been mentioned, this is the result of both posterization and too-aggressive noise reduction. By going so aggressive on noise reduction, you've created an incredibly smooth gradient in tones between the darker and lighter areas of nebulosity, and the 16-bit LUT is known to show posterization when extremely smooth gradients are present.

You may not have seen it before, because you may not have had nebulosity with such subtle transitions from dark to light.

I see this on occasion in some targets, but certainly not in all targets. With 18 hrs of data, I dare say you have enough Ha…
AstroRBA 4.98
andrea tasselli:
And you are wasting time applying NXT to a linear image.


Ditto
chroniclesofthecosmos 1.51
Topic starter
Appreciate all the comments!

To answer some of the questions I've seen:
  • Yes, the issue persists after stretching.
  • I use the default NXT settings most of the time, but can dial it back to something less intense (sounds like it's best to do so AFTER stretching?)
  • I can look into the 24-bit LUT and see if that makes a difference as well.
silversilk 1.81
I have noticed this issue in my processing for some time. I have not understood it to the level being discussed in this post, and had assumed it was a combination of my data set and the aggressiveness of the NR application.

However, recently I have moved away from NoiseXTerminator and begun to use GraXpert's noise reduction tool. The results are much better. I will typically run it at between .7 and .85 after BXT while still linear, and then use GHS to stretch the image. At that point I do the typical things to get the image where I want, and then, if needed, run another pass of GraXpert NR at a much milder .5 to .65 to just touch up any remaining noise.

I had learned early on from some help video that it was a good idea to do your NR before the initial stretch, but I am learning from this post that that is not correct, which may be why I need two passes of NR to get where I want. I will be playing with this and the other recommendations in this post.

Give the GraXpert noise reduction script a shot and see if you like the results.
Gondola 8.11
I agree, the noise reduction in GraXpert works extremely well and is easy to control. I use it just for that, as I really don't like its stretching results and its background extraction is very hit or miss.
CSChurch14 0.90
Thanks for this post...been having similar issues so I'll give some of the recommended changes in here a try
Screenshot 2024-10-20 at 3.32.03 PM.png
jrista 11.18
Chris:
Thanks for this post...been having similar issues so I'll give some of the recommended changes in here a try
Screenshot 2024-10-20 at 3.32.03 PM.png

My strongly held opinion here is, this is WAY, WAY too much noise reduction. 

I mentioned earlier in the thread the concept of "obliteration"? This right here is noise obliteration. In fact, it's OBLITERATION!

I know that there are shifting aesthetic preferences in any endeavor like astrophotography. That said, I think there IS a proper way to handle any signal. I think there are often misconceptions that noise and signal are...well, lacking a better term, "orthogonal" to each other (orthogonality between distinct signals is a different thing, but I'm specifically talking about a given signal, and its noise here). By this, I mean, I think people assume noise can be eliminated from a signal, without unduly affecting the signal.

The reality is that the signal IS NOISY. The noise is not orthogonal to the signal...it is an intrinsic aspect of uncertainty IN the signal. An ASPECT OF the signal. It cannot be "eliminated"...or as I've been putting it, OBLITERATED. Not, at least, without having a detrimental effect on the signal itself. 

The mottling, orange peel, and otherwise undesired effects in this image are the result of attempting to obliterate, or completely remove, noise from the signal.

Approach the problem differently. REDUCE the noise. Don't obliterate it. The noise is an intrinsic part of, an inherent aspect of, the signal. Obliteration decimates the quality of the signal, which is not really a desirable outcome. Instead of obliteration, aim to mitigate the impact of noise, while at the same time trying to affect the quality of the signal as minimally as possible. THE SIGNAL IS NOISY! That's a fact. You can manage the noise, adjust its nature, so that it becomes less apparent, while having a minimal effect on the information you can SEE and OBSERVE within the signal. 

I never try to eliminate the noise, or the "graininess" of my images. The noise is intrinsic, inherent, and a fundamental part of the image. Instead, I try to minimize its visual impact. This comes via reduction. You can think of noise as an amplitude...one pixel might be brighter than it "should" be, the next darker than it "should" be, the next after that pretty close to what it "should" be. We don't actually, and simply cannot actually, KNOW what each pixel really "should" be!! This unknowability, or what we call "uncertainty" in the signal, IS noise! By changing the noise, we are affecting the accuracy of our signals...which are measurements of some amount of light (the signal) at some point in the sky. Even if we just reduce the amplitude of the noise, we are still in fact affecting the signal...you cannot affect noise without affecting signal. You always affect both at the same time. But, reducing the amplitude can mitigate the impact noise has on our ability to see the pretty picture within.

Eliminating (i.e. obliterating) the noise is effectively flattening any meaningful differences between adjacent pixels, and that WILL, ALWAYS, have an effect on the signal...usually a detrimental one. To preserve the signal, you have to reduce the noise to a degree, but not obliterate. 

Say we have something like this:
[ASCII sketch: the measured signal, with pixel-to-pixel noise riding on the real structure]
Obliteration would likely produce something like this:
[ASCII sketch: the same profile flattened - the noise gone, but the meaningful differences between adjacent pixels gone with it]
When in fact, what we really wanted was something more like this:
[ASCII sketch: the real structure intact, with the noise amplitude merely reduced]
Reduction. Manage the noise, finesse it, to bring out the pretty picture without destroying it. Don't obliterate. ;)
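
As a concrete version of that sketch - not any particular tool's algorithm, just a 1-D numpy toy where a very wide moving average stands in for "obliteration" and a mild blend stands in for "reduction"; the signal shape, noise level, and kernel sizes are made up for illustration:

```python
import numpy as np

rng = np.random.default_rng(1)

# A 1-D "true" profile: a gentle ramp of nebulosity plus one small bump of detail.
x = np.linspace(0.0, 1.0, 512)
baseline = 0.2 + 0.1 * x
bump = 0.05 * np.exp(-((x - 0.5) / 0.02) ** 2)
true = baseline + bump

noisy = true + rng.normal(0.0, 0.02, x.size)     # what the stack actually gives us

# "Obliteration": smooth so hard that the small detail is flattened along with
# the noise (a very wide moving average stands in for heavy NR).
obliterated = np.convolve(noisy, np.ones(101) / 101, mode="same")

# "Reduction": shrink the noise amplitude while keeping the structure, here by
# blending a mild smooth back into the original at moderate strength.
mild = np.convolve(noisy, np.ones(7) / 7, mode="same")
reduced = 0.5 * noisy + 0.5 * mild

flat = slice(350, 450)          # a featureless stretch of the ramp
peak = int(np.argmax(bump))     # centre of the small detail bump

for name, est in [("noisy", noisy), ("obliterated", obliterated), ("reduced", reduced)]:
    grain = np.std((est - true)[flat])                                   # leftover noise
    detail = (est[peak - 2:peak + 3].mean() - baseline[peak]) / bump[peak]  # bump surviving
    print(f"{name:12s} residual grain {grain:.4f}   detail kept {detail:5.0%}")
# The heavy smooth wins on grain but flattens most of the small bump;
# the gentler blend keeps the bump while still cutting the grain down.
```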
Alexn 12.25
Jon Rista:
Approach the problem differently. REDUCE the noise. Don't obliterate it. … Manage the noise, finesse it, to bring out the pretty picture without destroying it. Don't obliterate. ;)

100% agree - and this is a VERY common thing these days. With a number of very powerful noise reduction tools available, and a lot of people wanting to pull the full extent of nebulosity out of a data set that does not have the integration time to support such aggressive processing, they will lean on powerful noise reduction that tends to cause these 'smooth plastic' looking areas of the image, where data and tonality in the target have been completely smeared smooth; then stars get put back on top of it, adding sharp contrast to the very smooth, detail-less image.
Herbert_West 4.72
Too much high-frequency noise reduction will very often result in blotches. Do not remove all the noise, and be especially careful to leave some amount of high-frequency noise - the finest graininess. Otherwise blotches will rear their ugly head. Simple as that.

Why does this happen? What high-frequency noise does is dithering. Not the slight frame shift during data acquisition, but the digital-signal-processing kind of dithering, essentially. It hides quantization errors that manifest as discrete changes in color and/or brightness - blotches. This article does a good job of explaining why and how it works: https://mediahygiene.com/what-is-image-dithering/
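
A toy numpy illustration of that dithering point (8-bit quantization standing in for a low-precision display path; the ramp values and sizes are arbitrary): without dither a shallow gradient collapses into a couple of bands, while a little fine-grained noise added before quantization lets the local average track the true gradient again.

```python
import numpy as np

rng = np.random.default_rng(2)

# A shallow, very smooth ramp (256 rows x 1024 columns), like a clean gradient
# left behind after heavy noise reduction.
ramp = np.tile(np.linspace(0.500, 0.504, 1024), (256, 1))

# Quantize straight to 8 bits: only a couple of output levels exist across the
# whole ramp, so it breaks into visible bands/blotches.
hard = np.round(ramp * 255) / 255
print("distinct levels, no dither:", np.unique(hard).size)

# Add fine-grained noise of roughly one quantization step *before* quantizing.
# Individual pixels still land on coarse levels, but the local average now
# tracks the true gradient instead of stepping.
dith = np.round((ramp + rng.uniform(-0.5, 0.5, ramp.shape) / 255) * 255) / 255
print("distinct levels, dithered: ", np.unique(dith).size)

true = ramp[0]
print("worst column error, no dither:", np.abs(hard.mean(axis=0) - true).max())
print("worst column error, dithered: ", np.abs(dith.mean(axis=0) - true).max())
```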
CSChurch14 0.90
Thanks for these comments all…very informative.  I’ve been realizing along my Astro-photo journey that the best pictures tend to have some of that intrinsic noise mentioned above. 

I need to pay more attention to the settings in BlurX and other tools when applied…great comments and thanks again!
Herbert_West 4.72
One more thing - don't do noise reduction in one go. It's far, far better to gently grind it down a few times, in both the linear and the non-linear stage. We're spoiled for choice here; there are so many great tools now.