No, GeForce Cards Are Not Suddenly "Playing Nice" with FreeSync Monitors

If triple buffering is applied by Windows 10, is there really much point in having the ability to use G-Sync/FreeSync in non-exclusive fullscreen? Do the drivers circumvent Windows when G-Sync/FreeSync is working correctly?

I guess I will have to check my display's FPS counter next time I'm playing a game in windowed mode...
If you can maintain your screen's maximum framerate at all times, then not really. If you can't, then yes.
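To see why, look at the frame pacing: with a fixed refresh rate and triple buffering, a finished frame still waits for the next scanout boundary, so delivery becomes an uneven mix of fast and slow intervals, while G-Sync/FreeSync refreshes the panel the moment the frame is ready. A minimal sketch in Python (all numbers are illustrative assumptions):

```python
# Sketch: frame pacing at a fixed 60 Hz refresh vs. a VRR panel,
# for a game rendering at a steady ~45 fps (22.2 ms per frame).
import math

REFRESH_MS = 1000 / 60   # 16.67 ms scanout interval at 60 Hz
RENDER_MS = 22.2         # ~45 fps render time (assumed)

def fixed_refresh_display_times(n_frames):
    """Each finished frame waits for the next fixed vsync boundary."""
    times = []
    for i in range(1, n_frames + 1):
        ready = i * RENDER_MS
        times.append(math.ceil(ready / REFRESH_MS) * REFRESH_MS)
    return times

def vrr_display_times(n_frames):
    """VRR: the panel refreshes the moment each frame is ready."""
    return [i * RENDER_MS for i in range(1, n_frames + 1)]

def deltas(times):
    return [round(b - a, 1) for a, b in zip(times, times[1:])]

print("fixed 60 Hz:", deltas(fixed_refresh_display_times(8)))
# -> uneven mix of ~16.7 and ~33.3 ms gaps: visible judder
print("VRR:        ", deltas(vrr_display_times(8)))
# -> steady ~22.2 ms gaps: smooth pacing at the exact same fps
```

At the panel's maximum framerate both paths line up, which is why VRR buys you little in that case.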
 
G-Sync only works in certain FPS ranges as well.

You are probably confusing the refresh rate range with low framerate compensation (LFC). There are FreeSync monitors that have this.

See here:
https://www.amd.com/Documents/freesync-lfc.pdf
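In practice LFC works by frame multiplication: when the render rate falls below the panel's minimum VRR rate, the driver presents each frame two or more times so the effective refresh rate lands back inside the supported window. A minimal sketch of that logic (illustrative, not AMD's actual driver code):

```python
# Sketch of low framerate compensation (LFC) via frame multiplication.
# When fps drops below the panel's VRR minimum, each frame is shown
# k times so the effective refresh (k * fps) is back inside the range.

def lfc_multiplier(fps: float, vrr_min: float, vrr_max: float) -> int:
    """Smallest repeat count k putting k*fps inside [vrr_min, vrr_max].

    Returns 0 if no integer multiple fits (range too narrow for LFC).
    """
    if fps >= vrr_min:
        return 1                   # already in range, no repetition needed
    k = 2
    while k * fps <= vrr_max:
        if k * fps >= vrr_min:
            return k
        k += 1
    return 0                       # LFC cannot engage on this panel

# A 48-144 Hz panel: 30 fps is doubled to an effective 60 Hz.
print(lfc_multiplier(30, 48, 144))   # -> 2
# A narrow 40-60 Hz panel: at 35 fps the first multiple (70 Hz)
# already overshoots the 60 Hz ceiling, so LFC cannot help.
print(lfc_multiplier(35, 40, 60))    # -> 0
```

This is also why the width of the range matters: frame multiplication only has room to work when the panel's maximum refresh is well above double its minimum (the linked AMD document puts the required ratio at roughly 2.5:1).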

From what I can find, G-Sync works from 36 fps and up. The chart of FreeSync monitors showed some only use FreeSync from 40 fps to 60 fps, which is a pretty narrow range. Now, that may be the range where it is most helpful, meaning there is still a worthwhile and measurable benefit. The point the quoted article was making was that FreeSync varies by significant amounts from display to display. Nvidia requires the manufacturer to meet various specifications before they can implement G-Sync. This is potentially a small part of the increased cost, but it also serves as a guarantee of performance for the consumer (at a higher price).

Only the end user can decide if the added cost is worth those benefits. Either way, these are notable differences everyone should know about.

As for LFC, to me that was a required improvement for FreeSync to be useful, and it makes FreeSync's behavior mimic how G-Sync handles low fps.
 
You were saying they don't sell monitors, as though Nvidia doesn't have a horse in the race. As for its profitability, my point was that I suspect it's only profitable if they ensure Nvidia cards don't support FreeSync. If they did, I think the G-Sync market would start evaporating, since while it's arguably better, it's not worth the price premium to most buyers and would likely stop turning a profit.

As for blame, look at it however you want. Consumers are easily manipulated. Always have been, always will be. Individual ones can be smart; as a whole, they're sheep. Many companies take advantage of that. That's about all there is to it.


Don't agree; things aren't as simple as so many want to believe. It seems that every time someone wants to make this value comparison, they want to do it in a vacuum, devoid of the reality most users are in when actually making a purchase decision. How many PC gamers decide to buy a new monitor, video card, etc., all at one time? For myself, it's rare. Usually it's one or the other, or something else. I built an entire system after returning from Iraq in 2008, put a GT880 (I think it was) in it, with a 1080p monitor. Later I wanted a better card; I was looking at early 2K, the whispers of 4K, and the costs, and I made a conscious decision to stay with 1080p for a while longer, so I bought a GTX-6400 knowing it would be a single-card solution, not too power hungry, that would drive a pair of 1080p displays at decent frame rates. G-Sync was too expensive to consider at the time, and I don't think FreeSync was even in play yet.

Then I decided that I did want a better display: 4K was available, GTX 10-series cards were new, and AMD did not have a single-card solution that would do 2K right. I bought a GTX 1070 first, followed a few months later by an Acer X-34, and a few months after that I got a Dell Gaming DG2417, also with G-Sync.

What all this illustrates is that once a person is "on the path," it's not so easy to change gears in order to save money, because it requires replacing multiple components. And if I'm not going to save any money, why sacrifice even "arguably better" performance? That's the real world at work. Now I have $1400 in a pair of displays that have G-Sync; you think I won't still be buying Nvidia for a while? Do you think I want Nvidia to drop G-Sync from their new cards? Fuck no.

Demand drives markets. Wish in one hand, shit in the other... you will only find one gets full.
 
...once a person is "on the path" it's not so easy to change gears in order to save money... Do you think I want Nvidia to drop G-Sync from their new cards?...
I think you're comparing apples to oranges here. My point is: fine, you're a typical buyer on an Nvidia path. Here's the choice people would have if Nvidia supported FreeSync:

1. Monitor with all the specs you want + FreeSync
2. Monitor with all the same specs you want + G-Sync + an extra $200 price tag

The majority of consumers will decide FreeSync is good enough. That would likely make G-Sync monitors flounder and become unprofitable over time. So by Nvidia ensuring it DOESN'T support FreeSync, they take away option #1 from you, ensuring that if you want to game on an Nvidia card, you have to pay that extra $200 on a monitor if you want dynamic frame syncing. I'm not entirely certain what you're arguing here, but you seem to be implying that wouldn't be the case.

And nowhere was I implying Nvidia should DROP G-Sync support, just that it may not be economically viable if they also supported the FreeSync standard, since it would be a conflict of interest for them.

And I still stand by my statement that the majority of consumers are easily manipulated. Our economy depends on it.
 
...Here's the choice people would have if Nvidia supported FreeSync... The majority of consumers will decide FreeSync is good enough...


I think you don't see part of the equation. From the display manufacturer's perspective, G-Sync may be an additional licensing cost that drives up the price of displays, but it also represents the upper tier for gamers: the peak, the point at which customers with cash will pay more for the best. Those customers wouldn't be interested in a G-Sync monitor if they weren't already paying more for G-Sync-capable Nvidia cards, so they will be willing to pay more for their monitors as well. This is where manufacturers can pad profits. This is where they have a feature that sets their monitors apart from others, and it doesn't require a gimmick to sell their products over the herd's.

Let's jump ahead in your scenario: Nvidia begins supporting FreeSync II and drops G-Sync.

1. People like me are fucking pissed. I have a bigger investment in monitors than in the rest of my computer system, and they will not work with FreeSync II.
2. All gaming monitors are now on the same standard; it's harder to differentiate or demand premium prices, so how do manufacturers set their products apart when they all do the same thing?
3. Consequently, profit margins narrow as manufacturers look to cost-cutting measures. Premium lines will exist, and we will hopefully be paying more for quality.
4. Lacking ways to differentiate their products, manufacturers will look for other ways to define "premium lines."

Do you think it was just a freak of nature that for a couple of years there was no FreeSync-capable monitor with the same type of IPS panel, at the resolutions and refresh rates of the 34" Acer and ASUS top displays? No other manufacturer bought those panels to offer them with FreeSync. Why?

Why do you think it's Nvidia's fault, and not the display manufacturers' desire to maintain this situation?

It's symbiotic; they made a deal, together.
 
...As for LFC, to me that was a required improvement for FreeSync to be useful, and it makes FreeSync's behavior mimic how G-Sync handles low fps...

LFC is exactly what G-Sync uses.

G-Sync isn't a specification; it is an entirely proprietary scaler module that replaces a good portion of the monitor's electronics with NVIDIA's own system. The cost is in what NVIDIA charges monitor manufacturers for that scaler module. That is why all G-Sync monitors have the same minimum refresh rate: they all use the same scaler.
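Since the module handles low fps the same way, by repeating frames below its floor, the multiplier arithmetic from the LFC sketch earlier in the thread applies unchanged. Using the ~36 fps floor cited above and an assumed 144 Hz panel maximum:

```python
# Reusing lfc_multiplier() from the earlier sketch with a G-Sync-style
# window (~36 fps floor cited in this thread; 144 Hz max is assumed).
print(lfc_multiplier(20, 36, 144))   # -> 2: each frame shown twice (40 Hz)
print(lfc_multiplier(10, 36, 144))   # -> 4: shown four times (40 Hz effective)
```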
 
...Let's jump ahead in your scenario: Nvidia begins supporting FreeSync II and drops G-Sync...

There is also support in the GPU for G-Sync, and in AMD GPUs for FreeSync. Just another reason that scenario isn't ever going to happen, as much as some might wish it.
 