WBPP on raw vs Calibrated files · Pleiades Astrophoto PixInsight

McComiskey:
I am working with an unusually large dataset, so large that I cannot run WBPP from calibration through integration in one go: the output surpasses the capacity of my 4 TB SSDs. My solution was to run the calibration first, save the files on one SSD, and then use WBPP a second time to do registration and integration onto a second SSD.

This results in warnings about performing measurements on non-raw (already calibrated) files.

So my question is: will the quality of the weighting analysis that PixInsight does be significantly affected by the fact that I am using calibrated files?

andreatax:
In general I'd say no, but I don't let PI decide what the weights are, nor do I use WBPP. YMMV.

darkmattersastro:
It’ll be just fine. I’ve stacked data both ways and the result was the same. You can always test a small subset and verify.

ChuckNovice:
I don't use WBPP either.

During my manual process, I do the weighting AFTER calibration/cosmetic correction. If you tell me that WBPP does it before, then I am not sure why it would do it at that stage, and I don't think it would matter either.

Side note: if your dataset is THAT large, you'll run into out-of-memory issues during stacking, and you'll need to first stack groups of X images, then stack the resulting group masters. With the massive resolution of the 6200MM, I have to resort to that technique when stacking more than 600-800 images on a 128 GB computer. Not sure WBPP does that.
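
Not my exact process, but as an illustration of the "stack groups, then stack the masters" idea, here is a minimal Python sketch. The file location, group size, and plain-mean combination are assumptions kept simple for illustration; a real integration would add normalization, rejection, and weights:

```python
# Hierarchical "stack of stacks": average fixed-size groups of
# calibrated/registered frames, then average the resulting sub-masters.
import glob
import numpy as np
from astropy.io import fits

GROUP_SIZE = 100                                   # frames per sub-master; tune to RAM
paths = sorted(glob.glob("registered/*.fits"))     # hypothetical file location

sub_masters, counts = [], []
for i in range(0, len(paths), GROUP_SIZE):
    group = paths[i:i + GROUP_SIZE]
    acc = np.zeros_like(fits.getdata(group[0]), dtype=np.float64)
    for p in group:                                # accumulate one frame at a time,
        acc += fits.getdata(p).astype(np.float64)  # so only ~2 frames sit in RAM
    sub_masters.append(acc / len(group))
    counts.append(len(group))

# Weight each sub-master by its frame count so a short final group does not
# skew the result; for a plain mean this equals one big mean of all frames.
master = np.average(np.stack(sub_masters), axis=0, weights=counts)
fits.writeto("master.fits", master.astype(np.float32), overwrite=True)
```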

TonyB53:
Mark McComiskey:
I am working with an unusually large dataset, so large that I cannot run WBPP from calibration through integration in one go: the output surpasses the capacity of my 4 TB SSDs. My solution was to run the calibration first, save the files on one SSD, and then use WBPP a second time to do registration and integration onto a second SSD.

This results in warnings about performing measurements on non-raw (already calibrated) files.

So my question is: will the quality of the weighting analysis that PixInsight does be significantly affected by the fact that I am using calibrated files?

Why not break your dataset into thirds, run WBPP on each smaller set, and then integrate the three resulting images?

Mau_Bard:
If I understand correctly, WBPP (and SubframeSelector does the same when used standalone) issues a warning in your second run because it thinks you have not calibrated the files. But that is not true, because you calibrated them in your first run, so the result should be correct despite the warning.

Tony's idea of splitting your huge set into sections is also good: run WBPP individually on each of them without integration, then run the integration separately.

Miguel's idea of stacking several integrated sub-masters, each coming out of a subset integration, poses the following theoretical problem: "Is an integration of integrated sub-masters equivalent to a single-run integrated master?" I would say that, though the general answer is no, it might work well with a large set of good-quality sub-exposures. As far as I understand the math, the final signal-to-noise ratio should be equivalent, while the outlier rejection would be less effective.
This is probably the easiest solution to implement, after testing.
I would be very interested in your test results!
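
A sketch of the SNR claim, under the simplifying assumptions of equal weights, equal group sizes, plain averaging, and no rejection: splitting $N$ subs of per-pixel variance $\sigma^2$ into $k$ groups of $N/k$ gives

$$\operatorname{Var}\!\left(\frac{1}{k}\sum_{j=1}^{k}\bar{x}_j\right)=\frac{1}{k^{2}}\sum_{j=1}^{k}\frac{\sigma^{2}}{N/k}=\frac{\sigma^{2}}{N}=\operatorname{Var}\left(\bar{x}_{\text{single pass}}\right),$$

the same noise as a single-pass mean, while the rejection in each pass only ever sees $N/k$ samples per pixel, which is why it is less effective.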

Ciao, Mau

swalkenshaw:
I would manually do all the steps.  I was so glad I found https://www.cloudynights.com/topic/616865-post-your-pixinsight-processing-flow/page-7, about halfway down the page.  It is the workflow of Christopher Foster.  It taught me almost everything I know about what WBPP does for me and how to do it manually.  (FYI, if it has not been fixed, the Subframe Selector section is missing a closing parenthesis.)

McComiskey (topic starter):
Thanks for all the replies.

A couple of things:

First, the warning is not that I have not calibrated the files.  This is not a warning in the diagnostics section of WBPP.  Rather, it is a warning in the Process Console while WBPP is running, saying that it sees I am using non-raw files and that this may create issues.

Second, there are a lot of ways to split up the work; that is not much of an issue.  I could, for example, just do each channel individually, using a single sub as a registration reference.  

My question was more about understanding why dark subtraction and flat division would cause issues for PI's measurement and weighting system, and I was wondering whether anyone had knowledge or experience of this.  I would rather not inadvertently introduce issues into the data.
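
(For reference, the only processing the subs have already received is the standard per-pixel calibration, ignoring bias/pedestal details,

$$\text{calibrated}=\frac{\text{raw}-\text{masterDark}}{\text{masterFlat}},$$

which is a smooth per-pixel operation, so I would not expect star-shape metrics like FWHM or eccentricity to change meaning; but I may be missing something about how the noise evaluation behaves.)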

I asked this on the PI forum, but got zero responses in a week.  AstroBin is so much more active!

andreatax:
There are no issues that I am aware of in over 12 years of using PI. AFAIK, the issue only arises if you have de-mosaicking in the processing pipeline. That warning would come up during image integration if you use PI's internal weight metering (but so would other schemes, depending on what is inside the FITS/XISF headers).

Semper_Iuvenis:
Just don't use the script.  There is nothing magical about it.  Memory constraints usually appear during the image integration process.  Running the process manually, you can change the buffer sizes and counts to break up the pixel-row memory usage.
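
Rough arithmetic on why integration runs out of memory, as a sketch in Python (the sensor dimensions are those of the 62 MP 6200MM mentioned above; the frame count and strip height are assumed numbers):

```python
# Back-of-envelope memory estimate for integrating many 32-bit float frames.
width, height = 9576, 6388     # 62 MP sensor, e.g. the 6200MM mentioned above
frames = 700                   # assumed stack depth
bytes_per_px = 4               # float32

naive_gb = width * height * bytes_per_px * frames / 1024**3
print(f"All frames in RAM at once: {naive_gb:.0f} GB")      # ~160 GB

# Working on horizontal strips of rows caps the working set, which is what
# smaller or more numerous row buffers achieve during integration.
strip_rows = 256               # assumed strip height
strip_gb = width * strip_rows * bytes_per_px * frames / 1024**3
print(f"One 256-row strip across all frames: {strip_gb:.1f} GB")  # ~6.4 GB
```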

McComiskey (topic starter):
Really?  Where are the settings to alter buffer sizes and counts?  That wasn't what I was asking about, but it could be very useful.

Semper_Iuvenis:
In the ImageIntegration process.  As for subframe selection and the assignment of weights, that's up to YOU and the weighting formula you use.
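
For what it's worth, one widely shared pattern for a manual weighting formula (not an official one; the 15/15/20 blend and 50-point pedestal below are just an example, with the min/max values taken from your own measured set) rescales each metric to a common range and blends them:

$$W = 15\left(1-\frac{F-F_{\min}}{F_{\max}-F_{\min}}\right) + 15\left(1-\frac{e-e_{\min}}{e_{\max}-e_{\min}}\right) + 20\,\frac{S-S_{\min}}{S_{\max}-S_{\min}} + 50,$$

with $F$ the FWHM, $e$ the eccentricity, and $S$ the SNR estimate, so that tighter, rounder stars and higher SNR score higher.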