Modder implements DLSS 3 Frame Gen, DLSS 2 in Jedi Survivor

GoodBoy

https://wccftech.com/star-wars-jedi-survivor-mod-shows-dlss-superiority-over-fsr-2/

"After the DLSS 3 (Frame Generation) Star Wars Jedi: Survivor mod that greatly boosts frame rate when using GeForce RTX 40 Series graphics card, modder PureDark is now working on implementing DLSS 2 (Super Resolution) in the game.

Since Jedi Survivor is an AMD-partnered title, it only features native support for AMD's FidelityFX Super Resolution 2 upscaling technique. However, the game's implementation of FSR 2 is quite bad and leads to blurry textures and degraded image quality. While a new mod helps a bit with that, GeForce RTX owners have been waiting for PureDark to deliver a proper NVIDIA DLSS 2 implementation.
PureDark recently posted an image comparison of DLSS 2 and FSR 2 in his work-in-progress Star Wars Jedi: Survivor mod, showing that the modded DLSS 2 version has fewer artifacts and is better aliased than the native FSR 2 implementation. However, he also noted that there wouldn't be any performance improvements moving from FSR 2 to DLSS 2.

PureDark hasn't released this DLSS Star Wars Jedi: Survivor mod yet..."



Set the YouTube video to stream at max resolution and maximize it to get the best impression of the improvements.

If a modder can put this in, there's no excuse that the developer "couldn't" or "couldn't afford to". They were paid not to.

This one is worth a read as well: https://wccftech.com/star-wars-jedi-survivor-mod-disables-resolution-scaling/
"The new mod, which can be downloaded right now from Nexus Mods, fixes the blurriness which has been discovered to be caused by TAA without disabling it while also disabling broken hidden resolution scaling. According to the modder, hidden resolution scaling in the game is tied to the graphics preset, and only the Epic preset doesn't feature any scaling. Also, there's a bug that causes 50% resolution scaling if FSR 2.0 is disabled, forcing every user to play it on to prevent the Star Wars Jedi: Survivor's visuals from looking even worse."

That mod is here: https://www.nexusmods.com/starwarsjedisurvivor/mods/74?tab=description
From the sounds of it, this one fixes the hidden default 50% resolution scaling (why is that even a hidden setting?).

This one should benefit both Nvidia and AMD users, so check it out if you own Jedi Survivor.

EDIT: I've taken the above resolution scaling mod, combined it with the Raytracing fixes mod, and created a single mod for Jedi Survivor. Place the attached file in C:\Program Files\EA Games\Jedi Survivor\SwGame\Content\Paks, or the equivalent path wherever you installed the game.

These are the changes:
[SystemSettings]
; 2 = use TAA as the default anti-aliasing method
r.DefaultFeature.AntiAliasing=2
; post-process AA quality (0-6); 4 = high
r.PostProcessAAQuality=4
; enable temporal upsampling (TAAU)
r.TemporalAA.Upsampling=1
; 0 = the standard (Gen4) TAA algorithm
r.TemporalAA.Algorithm=0
; force full internal render resolution (overrides the hidden 50% scaling)
r.ScreenPercentage=100.000000
; weight of the current frame in the TAA history blend; 0.2 is higher than the engine default, which cuts blur/ghosting
r.TemporalAACurrentFrameWeight=0.2
; number of jitter positions in the TAA sample pattern
r.TemporalAASamples=4
; tonemapper sharpening to offset TAA softness
r.Tonemapper.Sharpen=1.0

[/Script/Engine.RendererSettings]
; disable ray tracing on landscape/terrain geometry (works around the cutscene crashes, at the cost of RT shadows on terrain)
r.RayTracing.Geometry.Landscape=0
; the same AA/scaling settings again -- see the note below about section placement
r.DefaultFeature.AntiAliasing=2
r.PostProcessAAQuality=4
r.TemporalAA.Upsampling=1
r.TemporalAA.Algorithm=0
r.ScreenPercentage=100.000000
r.TemporalAACurrentFrameWeight=0.2
r.TemporalAASamples=4
r.Tonemapper.Sharpen=1.0

For a few of the settings I couldn't find a clear answer as to which section they belong under, so I placed them in both. It's working well: I disabled FSR, enabled Raytracing, and played thru the final battle, then thru the cutscene up to the credits. Previously, with Raytracing on and FSR off, I would always crash during this cutscene while Merrin is talking to Catia, or very shortly after when Cal is leaning over Bode. I couldn't finish the game...

It's clear that the visual quality is improved. I didn't turn on the fps counter but it was playing smoothly. Going to do that after work.

Here is a screenshot from playing with this mod installed:
STAR WARS Jedi  Survivor Screenshot 2023.06.29 - 00.02.14.40.jpg

This was taken as an Alt+F1 screenshot with GeForce Experience, it was not from Photo Mode.

Edit2: Can't figure out how to attach a file, trying again. Ok, that worked. Download the file and rename it back to .pak (take off the .txt).
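If you'd rather script that step, here's a minimal sketch (Python, purely as an illustration): the Downloads path is a placeholder and the Paks path is just the default install location from above, so adjust both to your system.

import shutil
from pathlib import Path

# The forum attachment comes down with a .txt tacked onto the real .pak name
downloaded = Path(r"C:\Users\you\Downloads\pakchunk99-Mods_RT+ResolutionScalingFixes.pak.txt")  # placeholder path
paks_dir = Path(r"C:\Program Files\EA Games\Jedi Survivor\SwGame\Content\Paks")  # adjust to your install

# Drop the trailing .txt so the name ends in .pak, then copy it into the game's Paks folder
target = paks_dir / downloaded.name.removesuffix(".txt")
shutil.copyfile(downloaded, target)
print(f"Installed {target}")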

Edit3: So this 'Raytracing fix' did help with my crashes during cutscenes. Previously I could never get thru the cutscene after the Bode fight, and it would crash 100% of the time while Merrin was talking to Catia. With the mod I was able to finish my Journey+ (second playthrough) without any issues and make it thru the Endgame cutscenes. The game is definitely faster, and better looking too. But after that, I had a crash on Jedha, and that one is somehow Raytracing related. There is one setting in the file that references Jedha; I might need to try toggling that.

This section:
[LandscapeRayTracing]
AllowedLODScalingPaths=/Jedha/

The line probably needs to be deleted. I will try to get to that tomorrow.

I tried removing the above line. It made no difference.
But using PureDark's mod plus this RT crash fix: https://www.nexusmods.com/starwarsjedisurvivor/mods/182?tab=files
the game finally looks as good as it can. I get 45 to 70 FPS, stable, no crashes. I played for probably 3 hours running around Jedha the whole time. No issues.
The attached fix also contains the above fix.

I bet the PureDark mod would be beneficial even for AMD users.
 

Attachments

  • pakchunk99-Mods_RT+ResolutionScalingFixes.pak.txt (184.4 KB)
Would be more useful if he did DLSS2 (though that may not be possible).

DLSS3 is a big meh. While I have been impressed at how well the tech generates frames, without artefacting or any kind of stutter, fake added frames with the same input latency as a lower framerate don't really do anyone any good. It's the input lag that really makes a lower framerate unplayable. Inserting fake frames with that same input lag adds nothing of value.
 
Would be more useful if he did DLSS2 (though that may not be possible).

DLSS3 is a big meh. While I have been impressed at how well the tech generates frames, without artefacting or any kind of stutter, fake added frames with the same input latency as a lower framerate don't really do anyone any good. It's the input lag that really makes a lower framerate unplayable. Inserting fake frames with that same input lag adds nothing of value.
While I agree it can fix some edge cases for monitor refresh where you'd otherwise be dealing with flicker and tearing, G-Sync/FreeSync can generally manage that; in their absence, fake frames* are still better than VSync + motion blur.

*All frames are fake, rendered from data; these are just rendered from different data.
 
Would be more useful if he did DLSS2 (though that may not be possible)...
He is implementing DLSS 2. The only DLSS 3 feature he has enabled is the Frame Generation.

And thinking about it, this is what consoles really need in the next generation. Nvidia GPU's + FrameGen. Maybe then they wouldn't have to render at 840p and upscale to 4k, resulting in the blurry mess.
 
Note that the "ray tracing mod" OP is using disables shadows affecting the lighting on the terrain, basically defeating the whole point of using ray traced shadows.
Would be more useful if he did DLSS2 (though that may not be possible).

DLSS3 is a big meh. While I have been impressed at how well the tech generates frames, without artefacting or any kind of stutter, fake added frames with the same input latency as a lower framerate don't really do anyone any good. It's the input lag that really makes a lower framerate unplayable. Inserting fake frames with that same input lag adds nothing of value.
DLSS3 = upscaling + frame gen. You can use upscaling with DLSS3 without using frame gen.
 
And thinking about it, this is what consoles really need in the next generation. Nvidia GPU's + FrameGen. Maybe then they wouldn't have to render at 840p and upscale to 4k, resulting in the blurry mess.
Most current gen console games are around 4k/30 native or ~1440p/60 upscaled to 4k with either FSR or some other upscaling method, and often times with DRS enabled to scale resolution as needed to hit their performance target. Although a few recent titles seem to be demanding enough to push resolutions lower to 1080p or so for 60 FPS targets.

Ideally FSR (or any other open-source upscaler) is improved to a point where it's more comparable to DLSS so it's available to everyone on all platforms. Because from what I've read and observed in current and previous gen consoles, Nvidia doesn't play very well in terms of setting margins on their chips and also committing to future support and compatibility when it comes time for a hardware refresh or successor. Like right now it seems like Nintendo may have to go with another supplier for the next Switch, as Nvidia doesn't seem too interested in iterating on the ~8 year old Tegra chip at this point given they haven't even iterated on the mostly successful Shield TV using the same chip since 2015 (with a minor improvement to it a few years ago to integrate AI upscaling to it).
 
Most current gen console games are around 4k/30 native or ~1440p/60 upscaled to 4k with either FSR or some other upscaling method, and often times with DRS enabled to scale resolution as needed to hit their performance target. Although a few recent titles seem to be demanding enough to push resolutions lower to 1080p or so for 60 FPS targets.

Ideally FSR (or any other open-source upscaler) is improved to a point where it's more comparable to DLSS so it's available to everyone on all platforms. Because from what I've read and observed in current and previous gen consoles, Nvidia doesn't play very well in terms of setting margins on their chips and also committing to future support and compatibility when it comes time for a hardware refresh or successor. Like right now it seems like Nintendo may have to go with another supplier for the next Switch, as Nvidia doesn't seem too interested in iterating on the ~8 year old Tegra chip at this point given they haven't even iterated on the mostly successful Shield TV using the same chip since 2015 (with a minor improvement to it a few years ago to integrate AI upscaling to it).
Nvidia has been iterating and updating the platform every generation:
Jetson, Xavier, Orin, …
The Switch was a custom variant of the Jetson platform; they have released two new iterations of it, each a dramatic improvement over the previous.

I expect the Orin Nano to be the basis of the next Switch; it would be an easy 8x performance improvement at a similar cost with the same power and thermal properties.
 
Most current gen console games are around 4k/30 native or ~1440p/60 upscaled to 4k with either FSR or some other upscaling method, and often times with DRS enabled to scale resolution as needed to hit their performance target. Although a few recent titles seem to be demanding enough to push resolutions lower to 1080p or so for 60 FPS targets.
Isn't 4K or 1440p/60 the exception for big games, with many going for 900-1200p for 60 fps (some lower than that) and around 1400-1800p for the 30 fps quality mode?
 
Isn't 4K or 1440p/60 the exception for big games, with many going for 900-1200p for 60 fps (some lower than that) and around 1400-1800p for the 30 fps quality mode?
Like I said, until recent titles. I watch literally every DF video, and up until the last few months of AAA releases (Diablo 4, Jedi Survivor, Forspoken), games rarely went below 1440/30 or 1080/60 with upscaling. Some newer games are still decent though: Dead Island 2 is at like 1728p/60, and the Forbidden West DLC (PS5 exclusive) is native 4K/30, checkerboard 4K/40 (in a 120 Hz container), or 1800p/60, and that game looks exceptional.
 
He's making something that people want and are apparently willing to pay for. I don't see the problem.
By that same logic why are mods free in the 1st place?

Many of them are much more complicated than "just clicking a couple of buttons".

Imagine all the hours put into complex total conversions and whatnot.
 
Like I said, until recent titles. I watch literally every DF video, and up until the last few months of AAA releases (Diablo 4, Jedi Survivor, Forspoken), games rarely went below 1440/30 or 1080/60 with upscaling.
Well, you just divided the pixel count by about 2, going from 4K/1440p to 1440p/1080p, unless that was a typo, so maybe we are saying the same thing; or you meant upscaled to 4K at 30 fps and upscaled to 1440p at 60 fps. Games that could run on a PS4 tended to offer a really high-resolution mode at 30 fps (if you can make a game run on a PS4 at just 30 fps, getting close to 4K on a PS5 is realistic), but games made only for the new consoles will tend to go for much better world visuals at a lower resolution.
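To put actual numbers on the "divided the pixel count by about 2" point, here's a quick back-of-the-envelope calculation (nothing game-specific, just raw pixel counts):

# Raw pixel counts for the three common render targets
resolutions = {"4K": (3840, 2160), "1440p": (2560, 1440), "1080p": (1920, 1080)}
pixels = {name: w * h for name, (w, h) in resolutions.items()}

for name, count in pixels.items():
    print(f"{name}: {count / 1e6:.2f} Mpixels")   # 4K: 8.29, 1440p: 3.69, 1080p: 2.07

print(round(pixels["4K"] / pixels["1440p"], 2))    # 2.25 -- dropping 4K -> 1440p
print(round(pixels["1440p"] / pixels["1080p"], 2)) # 1.78 -- dropping 1440p -> 1080p

So each step down is a bit more than half the pixels at the 4K step and a bit less at the 1440p step, but "roughly half" is a fair shorthand.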
 
By that same logic why are mods free in the 1st place?

Many of them are much more complicated than "just clicking a couple of buttons".

Imagine all the hours put into complex total conversions and whatnot.
I am not sure what the problem could possibly be. As for mods often being free: do they get into less trouble with the studios when they are free, and would many of them be easy to copy if they were not free? But I imagine that once the door is open, paid mods for Minecraft and other popular titles would exist? Or Skyrim via the Steam Workshop:
https://tes-mods.fandom.com/wiki/Paid_Mods
 
I am not sure what the problem could possibly be. As for mods often being free: do they get into less trouble with the studios when they are free? But I imagine that once the door is open, paid mods for Minecraft and other popular titles would exist? Or Skyrim via the Steam Workshop:
https://tes-mods.fandom.com/wiki/Paid_Mods
I'm against paid mods in the 1st place. Most popular mods are still free.
 
I'm against paid mods in the 1st place. Most popular mods are still free.
That's a broad statement. If a game becomes popular and the Steam Workshop or a console marketplace creates enough security for people to put a giant amount of work into a mod with the goal of making some money from that work, people would put thousands of hours of work into a mod and charge for it.

The game being modded can win from it, becoming a more popular title.
The modders win from it.
And obviously the buyers win from it (or they would not buy it).

Win, win, win, no? Who is hurt by this, voluntary as the work and the transaction are? Is it just some gut feeling, or is there some rationale for why it would hurt society for people to get paid for working on a piece of software people enjoy?
 
Win, win, win, no? Who is hurt by this, voluntary as the work and the transaction are?
That first part doesn't make sense because then how are big mods even a thing if there wasn't "security"?

How does a buyer win by...paying for something that used to be free?

It's one thing if you're talking about commissioning a project from someone else. Also supporting modders with donations is a thing, but slapping a paywall up first always leaves a bad taste.
 
How does a buyer win by...paying for something that used to be free?
They buy a mod that did not exist before, made by people who had the expectation of being able to charge for it; now that mod exists, and they enjoy it more than it cost them (or they would not have bought it), thus winning.

That first part doesn't make sense because then how are big mods even a thing if there wasn't "security"?
They do not come close (I imagine) to a game's budget, or to what a healthy mod market could look like.
 
Well, you just divided the pixel count by about 2, going from 4K/1440p to 1440p/1080p, unless that was a typo, so maybe we are saying the same thing; or you meant upscaled to 4K at 30 fps and upscaled to 1440p at 60 fps. Games that could run on a PS4 tended to offer a really high-resolution mode at 30 fps (if you can make a game run on a PS4 at just 30 fps, getting close to 4K on a PS5 is realistic), but games made only for the new consoles will tend to go for much better world visuals at a lower resolution.
Not sure I follow you now, but the resolutions I'm citing are the raw numbers before any upscaling to 4K. Pretty much all console games now are upscaled to 4K (if not native already) from a lower resolution, sans Series S and Switch of course.
 
It was so easy that the modder is charging a fee for it.

One modder charging $5 vs a paid development studio of multiple developers. Yes him charging for it surely proves what again? That it's not easy because one man did it, because he has to charge $5, and no way a paid development studio of multiple developers could compete with that? Was that the point?
 
Not sure I follow you now, but the resolutions I'm citing are the raw numbers before any upscaling to 4K. Pretty much all console games now are upscaled to 4K (if not native already) from a lower resolution, sans Series S and Switch of course.
In your first message you said:
4k/30 native or ~1440p/60 upscaled to 4k
then divided that pixel count by 2, saying:
1440/30 or 1080/60 with upscaling.
 
In your first message you said:
4k/30 native or ~1440p/60 upscaled to 4k
then divided that pixel count by 2, saying:
1440/30 or 1080/60 with upscaling to 4K.
Fixed.

Keep in mind DRS is usually at play too, so many games I was originally citing can vary between 1440-4k/30 or 1080-1440p/60 with DRS enabled. Most of the time it is not a fixed resolution and is somewhere in between those ranges.
 
One modder charging $5 vs a paid development studio of multiple developers. Yes him charging for it surely proves what again? That it's not easy because one man did it, because he has to charge $5, and no way a paid development studio of multiple developers could compete with that? Was that the point?
Some rando on Reddit said that this guy is making $18k a month from his Patreon. If there really is that much demand you'd be smart to capitalize on it.
 
Note that the "ray tracing mod" OP is using disables shadows affecting the lighting on the terrain, basically defeating the whole point of using ray traced shadows.

DLSS3 = upscaling + frame gen. You can use upscaling with DLSS3 without using frame gen.

That's not my understanding at all. My experience is that DLSS2 remains the upscaling that is used, and DLSS3 is only frame insertion.

They were supposed to be independent of each other. You can use one or the other or both as you choose.

This was at least the case at DLSS3 launch, unless they changed their mind.
 
One modder charging $5 vs a paid development studio of multiple developers.
I'm betting that DLSS3 implementation isn't working 100% perfectly. I'm sure the glitches that occur are probably acceptable... for a mod. For a AAA game, that's a different situation.
Yes him charging for it surely proves what again?
The few people who bought RTX 40-series cards are willing to pay for DLSS that literally makes the game worse. This is because Nvidia advertises the performance of the RTX 40-series with DLSS. This is why the RTX 4060 and 4060 Ti are a joke to the PC gaming community: the performance increase from the RTX 30-series is nearly none unless you factor in DLSS. So of course the owners of RTX 40-series cards are going to seek out DLSS3, because it justifies their poor purchase.
That it's not easy because one man did it, because he has to charge $5, and no way a paid development studio of multiple developers could compete with that? Was that the point?
Most mods are free. Why does this DLSS3 mod cost any money? The answer is because it took a lot of work. For what? A feature that lowers image quality and increases input lag? This is why I don't care for DLSS or FSR: they don't benefit gamers.
 
One modder charging $5 vs a paid development studio of multiple developers. Yes him charging for it surely proves what again? That it's not easy because one man did it, because he has to charge $5, and no way a paid development studio of multiple developers could compete with that? Was that the point?
If it were a simple process to implement, some other mod maker would drop a free version out of spite.
 
Most mods are free, if not simply because charging for them can often violate a game's TOS or run into copyright issues.
They can easily be sued for charging for a mod on intellectual property they don't own; it has happened before. The sad part is the mod likely makes a few changes and causes some sort of uplift, but is likely very broken and not working the way Nvidia intended it to.
 
.. If a modder can put this in, there's no excuse that the developer "couldn't" or "couldn't afford to". They were paid not to...

It was so easy that the modder is charging a fee for it.

There is surely some amount of testing and tweaking that goes into implementing DLSS... I still hold that the game should have come out supporting all 3 upscalers.

"But consoles! Games are made for Consoles now and those all use AMD gpu's! So, simple they make it for a console then that's what PC gets! It's cheaper!"

There may be some small truth to this, but really, for a triple A title that is going to sell 100 million copies, it doesn't matter if 80% of those sales are consoles, you still have 20 MILLION other customers buying your game to play on PC, and of those between 5 and 10 million have Raytracing and DLSS capable cards. Millions of users. Saying "Oh get used to it, that is normal because that is only 20% of the market", isn't a valid excuse. A game dev making a game to release on different hardware can afford to do what is required so that all copies sold are stable, well performing games. And there's no reason that it shouldn't have varying levels of Texture quality/shadow quality/draw distance, etc. These are features that are standards in games. And the game engines all support all of these sliders.

A new game selling 100 million copies pulls in more than 6 billion dollars! You're trying to tell me they can't afford to put decent raytracing and DLSS feature support in? If only 5 million of those users had Raytracing- and DLSS-capable cards, that's 300 million dollars in sales. Lack of support is inexcusable.

The way I see it, is that AMD is holding PC gaming back. Console gaming too really. This isn't something any gamer should be ok with.

***

It looks like AMD is sabotaging games, or paying for a feature not to be implemented. But I have changed my mind about that. I don't think they are actively discouraging anything. They also do not 'help'. I suspect the reality is that it is more likely that ALL of the dev studios suffer from a lack of effort to get games working well on multiple hardware platforms (save for id). Since Nvidia is very helpful with support when a title is sponsored by them, hardware support and the graphics implementation get the proper attention they need to be done right. But when it is an AMD sponsored game, there's no support from AMD for the graphics hardware support/pipeline (other than saying "here ya go, go download this tool"), so we get these buggy messes because of lazy devs / incompetent devs / time crunch / bug mountain / whatever it is, and AMD sure as hell isn't going to help with any of that.

AMD doesn't really have any excuse these days (for not providing proper support). Their processors are popular and selling in higher quantities than ever before. They have money. Just not the right mindset. They barely serve as competition in the GPU market. If intel gets good at making/supporting GPU's, AMD will become the bottom tier crap. And that will be AMD's own fault. They bought ATI, and (their graphics division) has just been on life support ever since. No new innovative technologies, just copying Nvidia to try and keep up.

Someone posted in another thread about AMD Graphical technologies accomplishments (paraphrased), and pointed out Z-buffering.
That was 2001.
That was ATI, not AMD.
AMD is a CPU company. They have a GPU guy held hostage in the basement. He only gets fed scraps. He makes something now and again, but he sure as hell isn't innovating anymore. He adjusts the rabbit ears on the old TV and sees ads for new toys, then tries to make his own version of the toy from the junk parts laying around. Not always successfully.

It looks like AMD is sabotaging games, or paying for a feature not to be implemented. But I have changed my mind about that.
Edit2: After seeing the GamersNexus video and AMD's response, it really does look like AMD was blocking competing upscalers. Depending on what releases with Starfield, it will either be the smoking gun, or they will have backpedaled and go "see, we don't block competitor tech!".
Multiple game devs have commented all over the internet, stating that if you add in one of the upscalers, adding the rest is very simple. Back to the 'no excuse' point.
 
But when it is an AMD sponsored game, there's no support from AMD for the graphics hardware support/pipeline (other than saying "here ya go, go download this tool"), so we get these buggy messes
Can confirm... AMD offers little support for devs compared to Nvidia when sponsoring a title.
 
There may be some small truth to this, but really, for a triple A title that is going to sell 100 million copies, it doesn't matter if 80% of those sales are consoles, you still have 20 MILLION other customers buying your game to play on PC, and of those between 5 and 10 million have Raytracing and DLSS capable cards. Millions of users. Saying "Oh get used to it, that is normal because that is only 20% of the market", isn't a valid excuse. A game dev making a game to release on different hardware can afford to do what is required so that all copies sold are stable, well performing games. And there's no reason that it shouldn't have varying levels of Texture quality/shadow quality/draw distance, etc. These are features that are standards in games. And the game engines all support all of these sliders.
Firstly, some PC games sell way more than 20% compared to console; 20% is what you see from a game like God of War when it's already been on console for years. While Nvidia may be 80% of GPUs used by PC gamers, how many of them are RTX owners? Out of the top 10 GPUs used on Steam, about four of them are GTX cards, with the GTX 1660 and 1060 being the top two. None of the GTX cards can make use of DLSS, but they all can make use of FSR. Also, none of the top 10 GPUs are RTX 40-series either, and you won't find one until you reach near the bottom, where you finally see the RTX 4070 Ti. So guess how many game devs are thrilled to implement DLSS3?
A new game selling 100 million copies pulls in more than 6 billion dollars! You're trying to tell me they can't afford to put decent raytracing and DLSS feature support in? If only 5 million of those users had Raytracing- and DLSS-capable cards, that's 300 million dollars in sales. Lack of support is inexcusable.
I think you're overestimating how many people own RTX cards. Probably more than GTX at this point, but not much more, and again the GTX 1660 and 1060 are the top two cards used on Steam. This is why selling overpriced GPUs for the past six years was a really bad idea for the PC gaming market: the average gamer is still using a GTX card. While AMD is certainly to blame for this, Nvidia is the market leader and AMD was playing along with Nvidia. To ask for DLSS and ray tracing is to ignore the Nvidia GTX owners. AMD is doing more for them with FSR than Nvidia is. How come DLSS doesn't work for GTX owners and AMD's FSR does?
The way I see it, is that AMD is holding PC gaming back. Console gaming too really. This isn't something any gamer should be ok with.
If Nvidia was such a good deal, then Microsoft and Sony would have used Nvidia. While the Nintendo Switch is selling great, it's also emulated extremely well on PC due to the lackluster performance the Switch offers. AMD also powers the Steam Deck, which is a portable PC/console that kicks the Switch's butt.
AMD doesn't really have any excuse these days (for not providing proper support). Their processors are popular and selling in higher quantities than ever before. They have money. Just not the right mindset. They barely serve as competition in the GPU market. If intel gets good at making/supporting GPU's, AMD will become the bottom tier crap. And that will be AMD's own fault. They bought ATI, and (their graphics division) has just been on life support ever since. No new innovative technologies, just copying Nvidia to try and keep up.
I think once FSR 3.0 is out this statement is not going to age well. Also, why do you care for DLSS or FSR when they make games worse?
 
Firstly, some PC games sell way more than 20% compared to console; 20% is what you see from a game like God of War when it's already been on console for years. While Nvidia may be 80% of GPUs used by PC gamers, how many of them are RTX owners? Out of the top 10 GPUs used on Steam, about four of them are GTX cards, with the GTX 1660 and 1060 being the top two. None of the GTX cards can make use of DLSS, but they all can make use of FSR. Also, none of the top 10 GPUs are RTX 40-series either, and you won't find one until you reach near the bottom, where you finally see the RTX 4070 Ti. So guess how many game devs are thrilled to implement DLSS3?
Thrilled? That has nothing to do with it. Nothing. MULTIPLE sources, multiple game devs, have all stated that once you do the work to add ANY upscaler, adding in support for the other two is trivial. Devs who put their money where their mouth is and their names to their statements. Not "anonymous dev X" making excuses not to add it, of which there has been only a single example.
I think you're overestimating how many people own RTX cards. Probably more than GTX at this point, but not much more, and again the GTX 1660 and 1060 are the top two cards used on Steam. This is why selling overpriced GPUs for the past six years was a really bad idea for the PC gaming market: the average gamer is still using a GTX card. While AMD is certainly to blame for this, Nvidia is the market leader and AMD was playing along with Nvidia. To ask for DLSS and ray tracing is to ignore the Nvidia GTX owners. AMD is doing more for them with FSR than Nvidia is. How come DLSS doesn't work for GTX owners and AMD's FSR does?
Well, I just went thru the May 2023 Steam hardware survey and found that 37.69% are RTX 2xxx or higher.
Steam has 120 million active users on a monthly basis.
That's 45.2 million Raytracing and DLSS 2 compatible cards. DLSS3 is obviously a smaller number of those. Checking: 1.7% of all surveyed cards are DLSS3-capable 4xxx cards. That's over 2 million cards for DLSS3.
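Rough math behind those numbers, using the figures cited above (the survey percentages are approximate, and I'm reading the 1.7% as a share of all surveyed cards):

monthly_active_users = 120_000_000   # Steam monthly active users, as cited above
rtx_share = 0.3769                   # RTX 2xxx or newer, from the May 2023 survey figure above
rtx40_share = 0.017                  # RTX 4xxx share of all surveyed cards (approximate)

print(monthly_active_users * rtx_share / 1e6)    # ~45.2 million DLSS 2 / RT capable cards
print(monthly_active_users * rtx40_share / 1e6)  # ~2.0 million DLSS 3 capable (40-series) cards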

So my point stands. Millions upon millions of PCs can do DLSS.

Why use it? Because if you do have to use an upscaler, it's the superior one.

"But many of those cards are low end!" - Exactly, those are the exact users who will want/need to use DLSS. The 4xxx users for the most part can live with it off.
If Nvidia was such a good deal, then Microsoft and Sony would have used Nvidia. While the Nintendo Switch is selling great, it's also emulated extremely well on PC due to the lackluster performance the Switch offers. AMD also powers the Steam Deck, which is a portable PC/console that kicks the Switch's butt.
Nvidia doesn't do consoles because it isn't very profitable, apparently. That sucks, because the AMD GPUs in those consoles... well, suck too. Hence why it's still better to be a PC gamer, even after all the many times the sky has fallen and PC gaming has been declared "dead".
I think once FSR 3.0 is out this statement is not going to age well.
Who knows. If it's better, great! That has zero impact on whether a game dev can STILL add in DLSS and XeSS support. As has been pointed out by multiple non-anonymous game devs, the effort to add extra scalers is trivial.
Also, why do you care for DLSS or FSR when they make games worse?
I do not, lol. I have a card that is fast enough to not need it. Yet again, this has zero impact on whether a game dev can STILL add in DLSS and XeSS support. As has been pointed out by multiple non-anonymous game devs, the effort to add extra scalers is trivial.

But since you mention it, the FSR implementation in Jedi Survivor is so bad that the game cannot even get thru certain scenes/areas without crashing UNLESS you have FSR enabled. I don't want that shit enabled, yet I am forced to enable it to finish the game.
FSR Disabled: 40 fps, peaks at 44 fps, lows down to 18 fps during cutscenes
FSR Enabled (Quality): 40 fps, peaks at 44 fps, lows down to 18 fps during cutscenes
FSR Enabled (Performance): 42 fps, peaks at 46 fps, lows down to the low 20s during cutscenes
What's wrong with this picture? Where is the "FSR works on anything"?? Where is the "FSR improves performance"? I'll tell you: nowhere. Has ANY game with DLSS ever been this bad? Nope.

As an update to that experience, I subscribed to PureDark's Patreon last night. $5 a month, and you get access to ALL of his mods. He has an RE4 Remake DLSS fix, a Jedi Survivor DLSS fix, a The Last of Us fix, and more. It's worth checking out. The Jedi Survivor fix explicitly directs the user to play with FSR off, so I will see how it goes. I played a little last night and it looked and played great, but I didn't bother to monitor the FPS.

I will update the post with FPS stats and some screenshots from Alt+F1 GeForce Experience (not from the built-in Photo Mode, which likely applies extra processing to get better-looking stills) later today.

One thing I immediately noticed when I played briefly last night: the opening menu page with the Jedi Temple in the background. You can now see 5 antennas on the top of the towers! Very thin lines that previously were apparently "erased" by the shit FSR. All completely and properly visible now. And I bet they've been there all along. I know I have "before" screenshots of this, so I will show those as well.
 