Jared Willson: I'm not saying that people who are guiding are wrong, because that's the vast majority of astrophotographers out there. In my case, I have been forced to shoot with a mount that's not capable of guiding with any accuracy, and for a long time I couldn't replace it. I shoot unguided out of necessity and have learned how to make it work. I see nothing wrong with exploring alternatives.
What mount are you using that can handle 30s sub-exposures unguided at 400mm focal length without any issues, but cannot be effectively guided? Not trying to cast aspersions, but I am surprised by this particular combination. I would think that most mounts capable of solid tracking for thirty seconds at a time, even at a relatively short 400mm focal length, would also be capable of decent guiding performance.
I'm not really a proponent of the "Guiding Needs to Die" approach, since I think guiding is actually a pretty elegant solution to the real problem of wanting to make tracking corrections in the middle of an exposure. But I certainly agree that for lots of imagers it's not a requirement, and it may soon go away entirely if read noise drops much from its current levels. I would love to get to the point where we are all just live stacking a few million 0.1s images! For me, it wouldn't be about getting rid of guiding; it would be about incorporating "lucky imaging" into even faint deep sky subjects and improving resolution.

If we could spin up an AWS server with the appropriate level of GPU power to do real-time image calibration, complete with error rejection, such that my "computer" could keep up with 10 frames per second and just give me a final stack, I think that would be really great. Then I'd need a really good internet connection to my telescope, but that's about it. No home NAS, no terabyte-sized directories of raw files, and all the resolution planetary imagers are used to, but applied to galaxies, clusters, and nebulae.

We're not quite there yet. I know my observatory couldn't handle transferring 10 full-resolution raw files per second to an S3 storage bucket, so right now I would have to pay for the latest and greatest NVIDIA has to offer stuffed into a desktop at the observatory. But there are already Python repositories out there that can keep up with live stacking 61-megapixel files at the rate of one every second or two, as long as you can bring a high-end graphics card to bear. Let read noise drop a bit more, and guiding will lose relevance.
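For what it's worth, the core of that "stack as you go, reject errors in flight" idea doesn't require anything exotic. Here's a minimal NumPy sketch of incremental live stacking with per-pixel sigma rejection; the class and method names are mine for illustration, not from any particular repository, and a real pipeline would add calibration and alignment before this step:

```python
import numpy as np

class LiveStacker:
    """Incrementally average incoming frames with per-pixel outlier rejection.

    Uses Welford's running mean/variance, so memory stays constant no matter
    how many short subs arrive -- the whole point of live stacking.
    """

    def __init__(self, sigma=3.0):
        self.sigma = sigma   # rejection threshold in standard deviations
        self.n = 0           # frames accepted so far
        self.mean = None     # running per-pixel mean
        self.m2 = None       # running sum of squared deviations (Welford)

    def add_frame(self, frame):
        frame = frame.astype(np.float64)
        if self.n == 0:
            self.mean = frame.copy()
            self.m2 = np.zeros_like(frame)
            self.n = 1
            return
        # Once a few frames are in, replace pixels that deviate wildly
        # (satellite trails, cosmic rays) with the current running mean.
        if self.n >= 3:
            std = np.sqrt(self.m2 / self.n)
            outliers = np.abs(frame - self.mean) > self.sigma * np.maximum(std, 1e-9)
            frame = np.where(outliers, self.mean, frame)
        # Welford update: mean and m2 in one pass, no frame history kept.
        self.n += 1
        delta = frame - self.mean
        self.mean += delta / self.n
        self.m2 += delta * (frame - self.mean)

    def result(self):
        return self.mean
```

Feed it one frame per callback from the camera loop and `result()` is always the current stack; that constant-memory property is what makes the "millions of 0.1s subs" scenario tractable at all, since nothing ever has to be re-read from disk.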
It's an old Meade LXD-75, remember those? It is one step above the LXD-55 because it has bearings! I won't say its 30-second tracking is solid, just something I can live with. On average, I have to throw out about 20% of the subs. The thing that saves me is that my polar alignment is excellent, so I am pretty much just dealing with RA error.