Raising the refresh rate raises the horizontal scan frequency and the pixel clock. Since this is an all-analog technology, you will typically get a lower-quality picture the higher those frequencies go. Historically I've been able to see the picture quality decrease with each step from 60i, 60p, 70, 75, 80, etc. Hz at a given resolution.
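For what it's worth, here's a rough back-of-the-envelope sketch of what "raises the frequencies" means in numbers. It assumes the standard VESA 1600x1200 timing totals (2160 x 1250 including blanking) purely as an illustration; the point is that the RAMDAC, cable, and video amp all have to pass a noticeably higher pixel clock as the refresh goes up.

```python
# Rough illustration: pixel clock and horizontal scan rate vs. refresh rate
# for a fixed resolution, using assumed VESA 1600x1200 timing totals.
H_TOTAL = 2160   # horizontal total in pixels (1600 visible + blanking)
V_TOTAL = 1250   # vertical total in lines (1200 visible + blanking)

for refresh_hz in (60, 75, 85):
    h_scan_khz = V_TOTAL * refresh_hz / 1e3          # horizontal scan rate
    pixel_clock_mhz = H_TOTAL * V_TOTAL * refresh_hz / 1e6  # pixel clock
    print(f"{refresh_hz} Hz: ~{h_scan_khz:.1f} kHz horizontal, "
          f"~{pixel_clock_mhz:.1f} MHz pixel clock")
```

With those assumed totals, going from 60 Hz to 85 Hz pushes the pixel clock from roughly 162 MHz to roughly 230 MHz, so the analog signal path needs meaningfully more bandwidth to keep edges sharp.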
I've never heard that higher frequency = worse picture quality. Do you have any write-ups on this? I can't notice a decrease from 70 to 85 Hz, but I do notice flicker at anything under 85 Hz. I don't care about frame rates in games... flicker, though, I am very susceptible to. I can't even take it when a co-worker's screen in my peripheral vision has it.
This has been the cause of many "Holy CRA*! This is SO MUCH BETTER!" reactions once I adjusted their screens.