I’ll try to answer; however, this is all at the edge of my understanding of analog and digital video (or of the OSSC itself, for that matter), so hopefully someone will correct me if I’m totally off base on any of this.
Does the OSSC first sample (ADC) and then scale (x2, x3, x4, x5), or vice versa?
Yes, sample then scale.
Is the LPF analog?
Yes, to my understanding both the video and sync LPFs are analog (applied prior to the ADC).
Analog -> LPF -> Sampling -> Scaling -> Post Processing
Yes, I believe this is correct (possibly some post-processing is done prior to scaling and some after; I’m not sure).
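To make the pipeline order concrete, here’s a toy sketch of the “sample then scale” idea, where scaling is simple line repetition (line-doubling). All names and numbers are my own illustrations, not the actual firmware logic:

```python
# Hypothetical sketch of the pipeline order discussed above: each analog
# line is (conceptually) LPF'd, sampled into discrete pixels, then scaled
# by plain repetition. Purely illustrative, not real OSSC internals.

def sample_line(analog_line, samplerate):
    """Take `samplerate` evenly spaced samples from a 'continuous' line,
    modelled here as a function of horizontal position 0..1."""
    return [analog_line(i / samplerate) for i in range(samplerate)]

def scale_x2(lines):
    """Line-double: emit every sampled line twice (no interpolation,
    no frame storage -- in the spirit of the OSSC's lag-free approach)."""
    out = []
    for line in lines:
        out.append(line)
        out.append(list(line))
    return out

ramp = lambda x: int(255 * x)                      # toy "analog" ramp line
field = [sample_line(ramp, 8) for _ in range(2)]   # 2 lines, 8 samples each
doubled = scale_x2(field)
print(len(doubled))                                # 4 lines after 2x
```

The point is only the ordering: digitization fixes the pixel grid first, and scaling then operates on those already-discrete samples.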
The H.samplerate is effectively the “Total Pixels”, while the Active Pixels is the visible resolution; everything else is for the back/front porches and sync, just like in the Nvidia Custom Resolution dialog, correct?
Basically, is the OSSC H.samplerate the span between the red lines?
Yes and yes.
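As a worked example of total vs. active pixels, here are the standard CEA-861 480p horizontal numbers (these particular values are from the spec, not from the OSSC):

```python
# CEA-861 480p horizontal timing: of the 858 total samples per line
# (the "H.samplerate" / Total Pixels), only 720 are active picture;
# the rest is split between front porch, hsync, and back porch.

h_total  = 858      # total samples per line
h_active = 720      # visible pixels
h_sync   = 62       # hsync pulse width, in pixels
h_bporch = 60       # back porch
h_fporch = h_total - h_active - h_sync - h_bporch

print(h_fporch)     # 16 pixels of front porch
```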
Does the process of sampling (digitization) include the analog sync signal, or is the analog sync used only as a trigger for the digital sampling process?
I ask because the sync options are given in µs instead of pixels.
Well, more like both: hsync is used to generate the pixel clock, but AFAIK the sync is still separated, sampled, and recombined to be output as part of the digital stream. At least I know that digital RGB still consists of three channels of image information (the pixel RGB values) and one channel for the pixel clock/sync. If the digital sync signal is further processed, it’s quite possible it is “regenerated” rather than sampled and piped through, but I don’t know.
The sync options relate to the incoming analog sync, so I guess it’s natural they’re given in units of time.
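A time in microseconds only maps to a pixel count once the pixel clock is fixed, and the pixel clock is samplerate × horizontal line rate. A small sketch, using typical NTSC 240p figures as assumptions:

```python
# Converting a sync duration (us) to pixels requires the pixel clock:
# pixel clock = samplerate * line rate. The numbers below are typical
# 240p console values, chosen for illustration only.

samplerate = 858          # total samples per line (the OSSC H.samplerate)
line_rate  = 15_734       # Hz, roughly the NTSC horizontal frequency
pixel_clock = samplerate * line_rate      # ~13.5 MHz

def us_to_pixels(us):
    """Express a duration given in microseconds as a count of samples."""
    return us * 1e-6 * pixel_clock

print(round(us_to_pixels(4.7)))   # a ~4.7 us hsync pulse spans ~63 pixels
```

This is presumably why the menu uses µs: the value stays meaningful before/independent of whatever samplerate you pick.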
What is it that incompatible TVs don’t like, then: the difference between the active and total pixel counts, or do TVs expect exact, to-the-last-pixel timings according to spec before they’ll show a picture? What is the general tolerance of TVs to off-spec timings?
Well, that’s the million dollar question… 😉 I don’t know if a single answer can even be given, except that “it’s a bit of everything”. The tolerances for pixel clock, line count, and refresh rate all interact, and when they are off from the standardized specs, separately or in combination, compatibility becomes very hard to predict. And the only one of those factors the OSSC can influence is the pixel clock (by varying the samplerate).
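To show what that one knob does: with the input line rate fixed by the console, changing the samplerate scales the pixel clock proportionally. The samplerate values below are made up for illustration (roughly in high-multiplier 240p territory):

```python
# With a fixed input line rate, samplerate is the only lever on the
# pixel clock: clock = samplerate * line_rate. Illustrative values only.

line_rate = 15_734                  # Hz, input horizontal frequency
for samplerate in (1560, 1600, 1650):
    clk_mhz = samplerate * line_rate / 1e6
    print(f"{samplerate} samples/line -> {clk_mhz:.2f} MHz pixel clock")
```

Line count and refresh, by contrast, pass through unchanged, which is why a display that rejects the source’s vertical timing can’t be helped this way.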
If so many TVs do not support the abnormal timings, why not use the HDTV EDID timings and insert “black pixels” as part of the active pixels to compensate, essentially standardizing the OSSC’s timing?
The OSSC has only limited line-buffering capability due to memory constraints; it cannot hold more than a handful of lines at a time, so such compensation is very limited or even non-existent. Standardization would inevitably involve frame(-rate) conversion, and from what I’ve read from the likes of Marqs, a dedicated frame buffer is really the only way to deal with that.
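A quick back-of-the-envelope comparison shows why a few lines fit on-chip while a full frame does not. These figures are my own assumptions (a 1280x720 24-bit frame), not OSSC internals:

```python
# Rough memory comparison: a handful of line buffers vs. a whole frame.
# Assumed mode and depth are illustrative, not the device's actual layout.

width, height = 1280, 720       # assumed output mode
bytes_per_px  = 3               # 24-bit RGB

line_bytes  = width * bytes_per_px
frame_bytes = line_bytes * height

print(f"5 lines : {5 * line_bytes / 1024:.1f} KiB")      # ~18.8 KiB
print(f"1 frame : {frame_bytes / 1024 / 1024:.2f} MiB")  # ~2.64 MiB
```

A few tens of KiB is plausible for FPGA block RAM; multiple megabytes per buffered frame is the realm of external memory, hence the “dedicated frame buffer” answer.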