Well, I have to disagree somewhat with your statement about throwing away the full potential. The OSSC provides three very important improvements over generic scan converters (although I only have one other device to compare it to): 1) almost zero lag, 2) no blurry motion artifacts, and 3) scanlines.
I made your recommended adv. timing changes for PAL Hi-Res and used the Workbench checkerboard pattern to assist. You are right, I did see some improvement in pixel image quality, most noticeably with the characters “M” and “W”, but the change was not that dramatic in my case, and it came with a downside: the image is shifted too far to the right and gets cut off. The setting responsible for this is H.samplerate. I can compensate for it completely by adjusting the Workbench overscan preferences, but unfortunately that will not fix the default image position in games and demos.
The next biggest setting that helped bring out a sharper picture was the sampling phase (especially noticeable with that checkerboard pattern).
I suspect how far off these settings can be before it matters will depend on the display. The closer I get to the display’s native resolution, the more pixel-perfect the result will be. My Vizio is 4K and 16:9; a 2K monitor at 16:10 will give different results, I’m sure.
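To put a rough number on that hunch: if we take a nominal 576-line PAL frame (the exact active height depends on the screenmode, so treat this as an assumption), none of the common panel heights divide it evenly, which means the panel’s scaler has to interpolate to some degree either way. A quick sketch:

```python
# Rough check of how a nominal PAL frame maps onto different panel heights.
# 576 active lines is the nominal PAL figure; actual Amiga screenmodes vary.
SOURCE_LINES = 576

def vertical_scale(panel_lines):
    """Return the scale factor and whether it is an integer (pixel-perfect)."""
    scale = panel_lines / SOURCE_LINES
    return scale, scale == int(scale)

# 4K 16:9, a 2560x1600 16:10 panel, and a 1920x1200 16:10 panel
for panel in (2160, 1600, 1200):
    scale, exact = vertical_scale(panel)
    print(f"{panel} lines: {scale:.3f}x {'(integer)' if exact else '(fractional)'}")
```

So even 4K lands on a fractional 3.75x for a full PAL frame; how gracefully each display hides that is presumably where the differences between panels come from.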
The other settings made no visual difference that I could detect. But something may have helped with the random loss of sync: I have not seen it happen since making the changes.
I did have to return H.samplerate to what it was. I had even been thinking I could reduce it further to move the image to the left. As you are probably aware, Amiga screenmodes were shifted slightly to the right compared to default monitor settings back in the day, and that appears to still hold true on modern displays.
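For anyone curious why H.samplerate is so sensitive, my understanding (not from official OSSC docs, so double-check me) is that the effective sampling clock is H.samplerate multiplied by the horizontal line rate, and pixel-perfect capture means matching that to the source’s pixel clock. The Amiga figures below are approximate values from memory:

```python
# Sketch of the arithmetic behind H.samplerate (my understanding, not the
# OSSC documentation): effective sample clock = H.samplerate * line rate.
# The Amiga numbers are approximate and worth verifying.
PAL_LINE_RATE_HZ = 15_625          # nominal PAL horizontal rate
AMIGA_HIRES_CLOCK_HZ = 14_187_580  # approx. Amiga Hi-Res pixel clock

def effective_sample_clock(h_samplerate, line_rate=PAL_LINE_RATE_HZ):
    """Samples per line times lines per second = samples per second."""
    return h_samplerate * line_rate

# The ideal samplerate is the source pixel clock divided by the line rate;
# every unit added or removed stretches or squeezes the whole line by one
# sample, which is why the image walks sideways when you change it.
ideal = AMIGA_HIRES_CLOCK_HZ / PAL_LINE_RATE_HZ
print(f"ideal H.samplerate ~ {ideal:.1f}")
```

If I have the firmware behavior right, horizontal position on its own is normally nudged with H.backporch rather than H.samplerate, which might be the less destructive way to chase that rightward shift.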