Help figuring out exposure times from dark sites · [Deep Sky] Acquisition techniques · Jaymz Bondurant

AstroJaymz (Topic starter):
Scott Badger:
I use a C9.25 Edge at f/10 in Bortle 3/2 skies. So, not f/4, but at 10 min for R, G, & B subs and 250 sec for Lums, I saturate a handful of pixels at most. Could be I'm wrong, but wouldn't light-polluted skies add to the pixel values of a star, making saturation more likely, all else equal (exposure, gain, etc.), than at a dark-sky site? Dark skies provide more contrast between background and stars/target, but how would the stars themselves be brighter? Where saturation can become a problem is at the longer exposure times intended to swamp the read noise under dark skies. Anyhow, here's a presentation by Dr Robin Glover (SharpCap), and at about 50 min he shows an exposure chart relative to focal ratio and Bortle value. He doesn't show Bortle 2 (just 3 and 1), and your camera may have less read noise than the 2.5e he charts, but extrapolating I would guess the optimum exposure would be around 60s for a mono sensor and 180s for OSC. https://www.youtube.com/watch?v=3RH93UvP358

As has been mentioned, posting an image would be helpful.

Cheers,
Scott

You are correct that the stars aren't any brighter under dark skies. The issue dark skies present is the need for longer exposures to swamp the read noise, and it's those longer exposures that are causing the saturated pixels. I'm basically trying to follow two "rules". One is to swamp the read noise. The other is to not saturate too many pixels. Swamping the read noise requires 3-4 minute exposures, but I'm saturating pixels at 30 seconds. Hence the dilemma: I can't follow one "rule" without breaking the other, so I'm trying to find a compromise between the two. I'll try to take some test shots next time I'm set up and.....HOLD THAT THOUGHT!!!!!

I just had an idea! As stated, I'm using N.I.N.A. for acquisition and paying attention to the Statistics tab next to the image. NINA applies an autostretch to every sub. I'm wondering if maybe, for some odd reason, NINA is displaying statistics for the STRETCHED data?!? It doesn't make any sense to do this. But it would explain why the numbers are much higher than they should be. So now I need to investigate that. If that's the case, then I don't actually have any problems at all!
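For reference, the "swamp the read noise" rule of thumb being followed here can be sketched in a few lines of Python. This is only a rough illustration, not anything taken from NINA or SharpCap: the factor of 10 is a commonly quoted target for how far the sky background should exceed the squared read noise, and the read-noise and sky-flux numbers below are made-up placeholders, not measurements from this rig.

def min_sub_exposure(read_noise_e, sky_flux_e_per_s, swamp_factor=10.0):
    # Shortest sub length (seconds) where sky shot noise dominates read noise,
    # using the rule of thumb: sky electrons per sub >= swamp_factor * RN^2.
    return swamp_factor * read_noise_e**2 / sky_flux_e_per_s

# Placeholder values: 1.5 e- read noise, 0.1 e-/pixel/s sky under dark skies.
print(f"{min_sub_exposure(1.5, 0.1):.0f} s")  # ~225 s, i.e. the 3-4 minute range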
AstroJaymz (Topic starter):
Mina B.:
Jaymz Bondurant:
They’re completely blown out. No color. Just bright white. I’m saturating thousands and thousands of pixels. Even at 0 gain, I’m limited to 60 seconds. Sometimes 30 or less depending on the star field.

That doesn't sound right. I am using the 183MC Pro at unity gain (111) and it has a laughably low full well at this gain, and I can still do 120s or even 180 seconds fairly well if the sky is dark enough. I suggest stretching stars and object separately; it helps tremendously, and even before I started doing this, my star colors were fine and only the brightest cores got blown out a little. With a 2600MC Pro and 0 gain, you shouldn't run into any of those issues.

I do stretch them separately. I've got that under control (albeit by using shorter than recommended exposures). But now I'm wondering if maybe NINA, which autostretches the display image, is using statistics from the stretched image. That would most definitely saturate many more pixels. Maybe my raw images aren't actually saturated. I need to look further into that.
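One way to look into it outside NINA would be to open a saved sub and count the clipped pixels directly, so the display stretch can't affect the numbers. A minimal sketch, assuming the subs are saved as 16-bit FITS (ceiling 65535) and that numpy and astropy are installed; the file name is hypothetical.

import numpy as np
from astropy.io import fits

data = fits.getdata("example_raw_sub.fits")  # hypothetical file name
saturated = np.count_nonzero(data >= 65535)  # 16-bit ceiling assumed
print(f"max ADU: {data.max()}, saturated pixels: {saturated}")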
AstroJaymz (Topic starter):
Well, I've confirmed that NINA is measuring the raw data. So, that's no help. But I did recently learn that there's a second place in NINA that shows saturated pixels: in addition to the Statistics tab, there's the Star Detection Results tab. Strangely, while the Statistics tab shows over a thousand saturated pixels, the Star Detection Results tab shows fewer than a hundred. So now I'm not even sure how to measure it accurately.
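One possible explanation for the gap, and this is only a guess rather than anything documented about NINA, is that one tab counts individual clipped pixels while the other effectively counts clipped stars. Grouping the clipped pixels into contiguous regions shows how a thousand-plus pixels can collapse into far fewer stars; a minimal sketch assuming scipy is available, with the same hypothetical file name and 16-bit assumption as above.

import numpy as np
from astropy.io import fits
from scipy import ndimage

data = fits.getdata("example_raw_sub.fits")
mask = data >= 65535                     # clipped pixels (16-bit assumption)
labels, n_regions = ndimage.label(mask)  # group them into contiguous blobs
print(f"saturated pixels: {np.count_nonzero(mask)}, clipped regions: {n_regions}")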
ScottBadger:
Another way to see the saturated pixels would be to first extract a lightness image from your RGB, convert the lightness to color (Image/Color Spaces/Convert to RGB color), and then "paint" the saturated pixels red with PixelMath. Adam Block uses this technique to show clipped pixels in a narrowband integration in this video, https://www.youtube.com/watch?v=M4Bie7NOXbg. Just use 1 instead of 0 in the PixelMath expression to highlight saturated pixels instead of clipped ones.

Cheers,
Scott
AstroJaymz (Topic starter):
Scott Badger:
Another way to see the saturated pixels would be to first extract a lightness image from your RGB, convert the lightness to color (Image/Color Spaces/Convert to RGB color), and then "paint" the saturated pixels red with PixelMath. Adam Block uses this technique to show clipped pixels in a narrowband integration in this video, https://www.youtube.com/watch?v=M4Bie7NOXbg. Just use 1 instead of 0 in the PixelMath expression to highlight saturated pixels instead of clipped ones.

Cheers,
Scott

I’ve seen that. But, to my knowledge, it merely “highlights” those pixels. Is there any way to see how many of those pixels there are?
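For what it's worth, the same paint-it-red idea can be reproduced outside PixInsight in a few lines of NumPy, which also prints the count. This is just a rough sketch, not a replacement for the PixelMath workflow in the video; it assumes a mono 16-bit sub and a hypothetical file name.

import numpy as np
from astropy.io import fits
import matplotlib.pyplot as plt

data = fits.getdata("example_raw_sub.fits").astype(np.float32) / 65535.0
mask = data >= 1.0                           # saturated pixels
print(f"saturated pixel count: {np.count_nonzero(mask)}")

preview = np.dstack([data, data, data])      # mono -> RGB so pixels can be colored
preview[mask] = [1.0, 0.0, 0.0]              # paint the saturated pixels red
plt.imshow(preview)
plt.show()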