John Linneman (Digital Foundry): "What I can say, because it was on DF Direct, is that I've personally spoken with three devs that implemented DLSS…"

Well, it would be 6 years old if they go with a minimally modified Orin. That does sound like Nintendo. :)
Orin got a refresh in 2020. It would be around 6x faster or more in every metric over the existing Switch at the same power draw, while giving them DLSS, since it is an Ampere-based GPU versus the Maxwell they use currently.

Based on the tweaks in the leaks I've seen, I would expect the CPU clocks going from 1.5 to 1.8 GHz, and the GPU up from 625 to around 700 MHz. So it lands squarely between the Jetson Orin Nano and the NX.
 
I feel Jensen can get the Switch 2 deal? A good chance to once again be the best-selling of all consoles, with a long lifetime.

Which seems an ideal platform for DLSS (I doubt they will aim for the more complex titles to do 4K native).

Switch 2? Is that like a Steam Deck?
 
The PlayStation 3 was absolutely essential to the success of Blu Ray. Maybe it’s because you guys are PC centric so you forget but Sony got over 70 million Blu Ray players into homes with the PS3.
Yep, it was one of the cheapest Blu-ray players and could double as a game console.
 
The PlayStation 3 was absolutely essential to the success of Blu Ray. Maybe it’s because you guys are PC centric so you forget but Sony got over 70 million Blu Ray players into homes with the PS3.
It was the primary reason I bought one. At the time it was the best option for a BR device: you could pay less and get less, or you could pay more and get the same or less. So get the PS3 and get one of the best BR players available, while also getting access to the FF games as well as their other exclusives.
 
Orin got a refresh in 2020. It would be around 6x faster or more in every metric over the existing Switch at the same power draw, while giving them DLSS, since it is an Ampere-based GPU versus the Maxwell they use currently.

Based on the tweaks in the leaks I've seen, I would expect the CPU clocks going from 1.5 to 1.8 GHz, and the GPU up from 625 to around 700 MHz. So it lands squarely between the Jetson Orin Nano and the NX.

Nintendo *loves* buying ancient tech. To the point where it really pisses off Nvidia; they only want to support the latest gen. As much as Nintendo might want that for a Switch 2, I wouldn't be surprised if Nvidia refuses and pushes them onto something newer. And honestly, I'm sure there are plenty of people at Nintendo eyeballing Steam Deck components.
 
The PlayStation 3 was absolutely essential to the success of Blu Ray. Maybe it’s because you guys are PC centric so you forget but Sony got over 70 million Blu Ray players into homes with the PS3.
Over its lifetime, sure. I think what really sealed Blu-ray, though, was support from home theater companies. What killed HD-DVD was Warner announcing they would only release on Blu-ray. The problem with HD-DVD is it only really got support from PC companies: MS, HP, NEC all supported HD-DVD. Blu-ray had the backing of the big theater equipment companies: Pioneer, Sony, Hitachi, Samsung. Toshiba was all alone in HD-DVD land when it came to high-end home theater equipment... and the movie studios wanted high-end theater equipment for their new hi-def format. Toshiba couldn't compete on price vs the PS3 for the masses... but more important to the decision makers, they also couldn't deliver at that $1000+ high-end player end either. I mean, the first Blu-ray player, the Samsung BD-P1000, had DVD upscaling already. Toshiba should have just got on board with Blu-ray and saved a billion dollars. :)

The PS3 popularized Blu-ray, sure... but Blu-ray won the battle before it even started.
 
Over its lifetime, sure. I think what really sealed Blu-ray, though, was support from home theater companies. What killed HD-DVD was Warner announcing they would only release on Blu-ray. The problem with HD-DVD is it only really got support from PC companies: MS, HP, NEC all supported HD-DVD. Blu-ray had the backing of the big theater equipment companies: Pioneer, Sony, Hitachi, Samsung. Toshiba was all alone in HD-DVD land when it came to high-end home theater equipment... and the movie studios wanted high-end theater equipment for their new hi-def format. Toshiba couldn't compete on price vs the PS3 for the masses... but more important to the decision makers, they also couldn't deliver at that $1000+ high-end player end either. I mean, the first Blu-ray player, the Samsung BD-P1000, had DVD upscaling already. Toshiba should have just got on board with Blu-ray and saved a billion dollars. :)

The PS3 popularized Blu-ray, sure... but Blu-ray won the battle before it even started.
Warner switched while closely examining the intricate details of the technology behind $400 million from Sony. They concluded it was indeed superior.
 
Nintendo *loves* buying ancient tech. To the point where it really pisses off Nvidia, they only want to support the latest gen. As much as Nintendo might want that for a Switch 2, I wouldn't be surprised if Nvidia refuses and pushes them on to something newer. And honestly I'm sure there are plenty of people at Nintendo eyeballing Steam Deck components.
Lots of rumors say that's the case: an Ampere-based Tegra chip on a better Samsung node like Samsung 4, which sounds ideal, as there's minimal competition for that node and architecture from their higher-margin products.

By a 2024 launch, Ampere will be kind of old (Nintendo could have been playing with and developing their launch games on something quite similar to the final version since the release of the T234 in 2021), but Ampere is something Nvidia will sell for years and years as their entry GPUs; they are still selling Pascal in 2023.
 
Nintendo *loves* buying ancient tech. To the point where it really pisses off Nvidia; they only want to support the latest gen. As much as Nintendo might want that for a Switch 2, I wouldn't be surprised if Nvidia refuses and pushes them onto something newer. And honestly, I'm sure there are plenty of people at Nintendo eyeballing Steam Deck components.
Technically this is their newer stuff, and Nvidia supports this iteration of the platform until Jan 1, 2030. Also, I fucked up the launch date: March 2023 is when they refreshed the lineup, not 2020.

ETA Prime has a pretty good breakdown on it.

View: https://m.youtube.com/watch?v=nmZ6fhkFmDY
 
10+ million units sold before Toshiba left the market; for an HD media player, yes, extremely successful. The Blu-ray-capable console alone sold more than 10 times the total of HD-DVD players. Not sure what the Wii, which could not play a competing format (like HD-DVD), has to do with the conversation. The PS3 was sold at a loss and was cheaper than many HD players.
As a Blu-ray player, the PS3 was successful. As a game console, it wasn't as much. Remember, Blu-ray players at the time cost nearly $1k in 2006 money, which is why so many bought a PS3: $600 was cheaper than $1k. Blu-ray won by default, not because it was superior (which it was), but because the world was moving away from physical media.
The idea that the 2006-2008 battle between HD-DVD and Blu-ray was won because OTT streaming made it irrelevant is complete revisionism. Streaming became big around 2015 and barely existed before 2010. The fact that DVD would stay the favorite format, and that streaming would make both irrelevant before it mattered, was not known at the time. DVD was such a money maker, the best Hollywood ever saw in its history (it was their version of the music industry's 90s CD moment), that they would work hard to keep that physical medium alive.
It wasn't streaming that was taking away from physical media, but piracy. The death of the physical format had long arrived before streaming took hold. Remember when DivX was the pirates' video codec of choice? Even your own graph doesn't show any significant gains with Blu-ray, but streaming certainly took off. One can certainly say it wasn't Blu-ray that killed the DVD star.

The whole point was to show that proprietary standards don't work for very long. Does Nvidia let AMD and Intel use DLSS? Would Nvidia even make changes to their DLSS code to make better use of AMD and Intel GPU hardware? This is like Sony not letting anyone else make Blu-ray players and discs.
 
Nvidia supports this iteration of the platform until Jan 1, 2030

With that support and those specs I'm sure we're looking at the Switch 2. Nvidia's not out of the console market unless they really mess things up somehow.
 
I would like to see a real tech journalist hammer Intel down on that statement again.

I know of only one of those. So get after it ;)

If I hated you, talked shit about you, said you were doing everything wrong, said you were unfair and no good, then handed you $10,000: would you change your ways?

We can test that. I'll PM you my paypal. Start talking shit whenever.
 
As a Blu-ray player, the PS3 was successful. As a game console, it wasn't as much. Remember, Blu-ray players at the time cost nearly $1k in 2006 money, which is why so many bought a PS3: $600 was cheaper than $1k. Blu-ray won by default, not because it was superior (which it was), but because the world was moving away from physical media.

It wasn't streaming that was taking away from physical media, but piracy. The death of the physical format had long arrived before streaming took hold. Remember when DivX was the pirates' video codec of choice? Even your own graph doesn't show any significant gains with Blu-ray, but streaming certainly took off. One can certainly say it wasn't Blu-ray that killed the DVD star.

The whole point was to show that proprietary standards don't work for very long. Does Nvidia let AMD and Intel use DLSS? Would Nvidia even make changes to their DLSS code to make better use of AMD and Intel GPU hardware? This is like Sony not letting anyone else make Blu-ray players and discs.
I think the most likely scenario is Microsoft and Khronos create an internal framework for an upscaler that passes some generic dataset back to the drivers, and they let AMD, Intel, and Nvidia choose how to deal with the data that is passed to them. Make it all a driver-level thing: Nvidia can pass it to whatever tensor, optical flow, or accelerated hand-job tachometer cores they want to, and AMD and Intel can do their own things. Let them fight it out at a hardware level, but at least get things standardized so developers only need one output interface to deal with. Something like the sketch below.
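Purely hypothetical, but the vendor-agnostic contract could be as thin as this (every name here is made up for illustration; the real inputs would presumably mirror what DLSS/FSR/XeSS already consume):

```cpp
#include <cstdint>

// Hypothetical sketch of a vendor-agnostic upscaler contract.
// The app fills in the generic dataset once per frame; each vendor's
// driver decides what hardware (tensor cores, shaders, whatever)
// actually chews on it.
struct UpscaleInputs {
    void*    color;          // low-res color target
    void*    depth;          // depth buffer
    void*    motionVectors;  // per-pixel motion, in texels
    float    jitterX, jitterY;
    uint32_t renderWidth, renderHeight;
    uint32_t outputWidth, outputHeight;
};

class IUpscaler {            // implemented behind each vendor's driver
public:
    virtual ~IUpscaler() = default;
    virtual void Evaluate(const UpscaleInputs& in, void* output) = 0;
};
```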
 
Streaming was not meant to take away from physical media, but piracy. The death of the physical format had long arrived before streaming took hold. Remember when DIVX was the pirates choice video codec? Even your own graph doesn't show any significant gains with BluRay, but streaming certainly took off. One can certainly say it wasn't BluRay that killed the DVD star.
Sure, it is not Blu-ray that killed the DVD. We were talking about whether Toshiba left the race for the new HD format because the streaming subscription model made physical media irrelevant, rather than because Blu-ray beat it. That has the timeline a bit off.

When the battle occurred (2005 to 2008), the physical-disc movie business was a giant one and monthly OTT streaming accounts barely existed. To put it in perspective, Netflix would not start offering a streaming-only subscription until 2010, and only started streaming movies at all in 2007.



And it stayed bigger than non-physical spending for a long time:

[Chart: MPAA total home-entertainment spending, physical vs. digital]


The whole point was to show that proprietary standards don't work for very long.
Yes, and my whole response to your point was to be suspicious of it, by naming a long list of proprietary standards that have worked for a very long time. x86 has been going strong since 1978, and DVD was a giant success; we can also find lists of open standards that failed to catch on, and proprietary ones that failed.

changes to their DLSS code to make better use of AMD and Intel GPU hardware?
They would just need to be API-agnostic, imo, for DLSS to not die: any time you turn motion-vector upscaling on, it should work regardless of whether it is DLSS/FSR/XeSS. Otherwise it will be rough, but DLSS does not need to work on other cards at all for that. Would the competition accept DLSS/FSR/XeSS working through a single agnostic API? If they fear never catching up and have nothing to win from it, maybe not.
 
You misunderstand what I was saying a bit. Nvidia is the one not in any position to bargain. When FSR gets rolled into DX (it's coming)... NV might be able to save some face if MS allows it. They don't need Nvidia's DLSS for anything. DLSS will die, just like Betamax died. Superior tech doesn't matter if 70% of a market is using another standard.
I was simply saying it would be fantastic if Nvidia saw the writing on the wall and helped the new DX standard along if they can.

Hey, when things like the MiniDisc died... we lost superior tech like ATRAC compression and got stuck with the far crappier MP3 for years. Hopefully Nvidia, who has in general played well with graphics standards for years, comes to their senses a bit. Ya never know, it could happen. haha

It doesn't make sense as part of DX. A temporal upscaler is at odds with every graphics API on the planet because it's an intrinsically higher level thing than any API ever deals with.

The core APIs don't know what shadows are, they don't know what models are. They don't even have a concept of a camera or any of the typical math primitives like matrices and vectors, and they certainly have zero idea of the multiple things you need to feed into such a system for it to even work. These are all constructs built and handled by a developer.

Same with ray tracing. It's a super basic, foundational thing. They give you the tool and it's on you to do something with it. You get a way to build the acceleration structure, you get a way to fire rays, and that's basically it. It doesn't know what a reflection is, or a shadow. It lets you fire a ray and intersect a triangle and that's about it.
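For illustration, this is roughly the whole host-side surface DXR gives you for "firing rays" (a sketch using the real D3D12 entry points; the pipeline and shader-table variables are assumed to already exist):

```cpp
#include <d3d12.h>

// Sketch: the entirety of "fire rays" in DXR. The API doesn't know what
// a reflection or a shadow is; it just runs your ray-generation shader
// Width x Height times, and meaning is entirely your HLSL code's problem.
void FireRays(ID3D12GraphicsCommandList4* cmdList,
              ID3D12StateObject* rtPipeline,     // your compiled RT pipeline
              D3D12_DISPATCH_RAYS_DESC desc,     // shader tables you built yourself
              UINT width, UINT height)
{
    desc.Width  = width;   // by convention, one ray-gen thread per pixel
    desc.Height = height;
    desc.Depth  = 1;
    cmdList->SetPipelineState1(rtPipeline);
    cmdList->DispatchRays(&desc);
}
```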

The APIs are an interface to the hardware first and foremost, and the driver provides the implementation. This isn't a hardware function. Which is why it's a software library.

Now you're introducing concepts like a camera, timers, motion vectors, and other shit into the API that was previously oblivious to it all. Which again, is why at best it'd wind up as a Microsoft sanctioned example/library.
 
It doesn't make sense as part of DX. A temporal upscaler is at odds with every graphics API on the planet because it's an intrinsically higher level thing than any API ever deals with.

The core APIs don't know what shadows are, they don't know what models are. They don't even have a concept of a camera or any of the typical math primitives like matrices and vectors, and they certainly have zero idea of the multiple things you need to feed into such a system for it to even work. These are all constructs built and handled by a developer.

Same with ray tracing. It's a super basic, foundational thing. They give you the tool and it's on you to do something with it. You get a way to build the acceleration structure, you get a way to fire rays, and that's basically it. It doesn't know what a reflection is, or a shadow. It lets you fire a ray and intersect a triangle and that's about it.

The APIs are an interface to the hardware first and foremost, and the driver provides the implementation. This isn't a hardware function. Which is why it's a software library.

Now you're introducing concepts like a camera, timers, motion vectors, and other shit into the API that was previously oblivious to it all. Which again, is why at best it'd wind up as a Microsoft sanctioned example/library.
Instead of treating upscaling as a thing separate from everything else, why not roll it in with AA? Both DirectX and Vulkan have programmable anti-aliasing interfaces, because really that is what the upscaling is doing: smoothing out the blurry-ass edges of a low-resolution render, it just happens to be bringing it up to the output resolution in the process. So if DX12 can have its programmable MSAA interface as part of the spec (see the sketch below), why not add some form of upscaling to the anti-aliasing?
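For reference, the programmable MSAA hook DX12 exposes today is, as far as I know, programmable sample positions, and note how little it knows about the scene (a sketch; the sample pattern is made up):

```cpp
#include <d3d12.h>

// Sketch: D3D12 programmable sample positions (requires
// ProgrammableSamplePositionsTier support). Coordinates are in
// 1/16-pixel units in [-8, 7] around the pixel center. All this does
// is nudge where the hardware takes its MSAA samples; it has no
// concept of cameras, history, or motion, which is the gap a
// TAA-style upscaler would have to fill.
void SetCustomSamplePattern(ID3D12GraphicsCommandList1* cmdList)
{
    D3D12_SAMPLE_POSITION positions[4] = {
        { -6, -2 }, { 2, -6 }, { 6, 2 }, { -2, 6 }  // rotated-grid 4x pattern
    };
    cmdList->SetSamplePositions(4, 1, positions);   // 4 samples/pixel, 1-pixel pattern
}
```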
 
What? Blu-ray won because Sony spent billions of dollars bribing the movie studios to release only on BD.

I vaguely remember thinking that HD-DVD was superior but have no recollection of why.
 
What? Blu-ray won because Sony spent billions of dollars bribing the movie studios to release only on BD.
Not just that; they also pushed, heavily, the idea that their BD+ protection layer on top of AACS (which both formats had) made Blu-ray so much more secure against those evil pirateses, and that it wouldn't be cracked for decades (LOL). There was also the lesser, but still significant, fact that the PS3 was a Blu-ray player. Back then players cost multiple hundreds of dollars and up, so buying one was a big investment. However, if you'd already gotten a PS3, you had a Blu-ray player, which made it more attractive than buying a separate HD-DVD player.

I vaguely remember thinking that HD-DVD was superior but have no recollection of why.
From a visual standpoint, they were the same. They had the same options for compression and all that kind of stuff, so that didn't matter. The part that could be argued to be superior was that HD-DVD was cheaper to produce; it didn't require the same level of factory retooling that Blu-ray did.
 
They don't even have a concept of a camera or any of the typical math primitives like matrices and vectors, and they certainly have zero idea of the multiple things you need to feed into such a system for it to even work. These are all constructs built and handled by a developer.
That changed a lot (with DX12/Vulkan, I imagine?); back in my day:
https://docs.gl/gl3/glMultMatrix
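(For the youngsters, a minimal sketch of the fixed-function style that link refers to; here the driver itself owned the matrix stack:)

```cpp
#include <GL/gl.h>

// Legacy fixed-function OpenGL: the API really did know about matrices.
// glMultMatrixf multiplies the current matrix by a column-major 4x4.
void PlaceObject(const GLfloat model[16])
{
    glMatrixMode(GL_MODELVIEW);
    glPushMatrix();
    glMultMatrixf(model);
    // ... immediate-mode glBegin()/glVertex() drawing went here ...
    glPopMatrix();
}
```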
 
TAA and upscaling work because they literally move the camera to try to accumulate more information.

So now you need a function that jitters your camera. Except there's no such thing as a camera per se. So you really need a function that jitters the underlying view matrix. Except a matrix still isn't defined anywhere either. DirectX doesn't define what you're dealing with here; it's the application's job. So now you're already adding an abstraction unlike anything else in the API.

And it needs to happen over time, over multiple frames. DirectX doesn't give a shit about tracking anything here, because why would it - it's your job to tell it what to do.

And you probably just want the actual 3D world, because you probably don't want to blur your UI. DirectX has no clue where that line is drawn; it's the app's responsibility.

Then you need to work out more camera and motion fuckery to cope with ghosting. Why would DirectX deal with this? It's the app's responsibility.

Versus programmable MSAA, where you just move the sample location within the pixel grid and the hardware goes brrrr all the same, and that's the end of the story. That is a hardware feature.

TAA is not; it's an amalgamation of all these different steps that the application needs to be acutely aware of to cumulatively build into the end effect. Which is why it's a software library, and why no driver can provide a generic version that you just turn on and it works.
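To make that concrete, here's roughly the per-frame jitter step the application has to own before any temporal accumulation can happen (a sketch; the Halton choice and the row-major D3D-style projection layout are app-side conventions I'm assuming, not anything DirectX defines):

```cpp
#include <cstdint>

// Halton(2,3) low-discrepancy sequence: well-spread offsets in [0, 1).
static float Halton(uint32_t index, uint32_t base)
{
    float result = 0.0f, f = 1.0f;
    while (index > 0) {
        f /= static_cast<float>(base);
        result += f * static_cast<float>(index % base);
        index /= base;
    }
    return result;
}

// Nudge the projection matrix by a sub-pixel amount each frame so TAA
// sees slightly different sample positions it can accumulate.
// Assumes a row-major float[4][4] where proj[2][0]/proj[2][1] end up
// shifting clip-space x/y after the perspective divide.
void JitterProjection(float proj[4][4], uint32_t frameIndex,
                      uint32_t width, uint32_t height)
{
    uint32_t i = (frameIndex % 8) + 1;              // 8-frame jitter cycle
    float jx = Halton(i, 2) - 0.5f;                 // pixels, [-0.5, 0.5)
    float jy = Halton(i, 3) - 0.5f;
    proj[2][0] += 2.0f * jx / static_cast<float>(width);   // pixel -> clip space
    proj[2][1] += 2.0f * jy / static_cast<float>(height);
}
```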
 
That changed a lot (with DX12/Vulkan, I imagine?); back in my day:
https://docs.gl/gl3/glMultMatrix

Because this is the fixed function pipeline and hasn't mattered for literal decades.

Once you get past that, they just provide a way to upload a struct of... data. Could be a vector, time, matrices, whatever - doesn't really matter; it's just a block of bytes that gets copied, in the grand scheme of things, and when you write the shader, you define how to interpret it. Something like this:
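(A sketch of what that amounts to in D3D12; the struct contents are arbitrary, which is exactly the point:)

```cpp
#include <cstring>
#include <d3d12.h>

// To the API this is just sizeof(PerFrameData) bytes. Only the app's
// shader code decides these bytes mean "a matrix, a time, a jitter".
struct PerFrameData {        // respect 16-byte cbuffer alignment
    float viewProj[16];
    float timeSeconds;
    float jitter[2];
    float pad;
};

void UploadPerFrame(ID3D12Resource* uploadBuffer, const PerFrameData& data)
{
    void* mapped = nullptr;
    D3D12_RANGE noRead = { 0, 0 };       // we won't read from this mapping
    uploadBuffer->Map(0, &noRead, &mapped);
    std::memcpy(mapped, &data, sizeof(data));
    uploadBuffer->Unmap(0, nullptr);
}
```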
 
I vaguely remember thinking that HD-DVD was superior but have no recollection of why.

It surprised me HD-DVD didn't win, simply for the reasons that everyone knew what DVD was, and that for a few years by that point consumers had been beaten over the head with the phrase "High Definition". Seemed a no-brainer, but Sony shenanigans won the day. :eek:
 
It surprised me HD-DVD didn't win, simply for the reasons that everyone knew what DVD was, and that for a few years by that point consumers had been beaten over the head with the phrase "High Definition". Seemed a no-brainer, but Sony shenanigans won the day. :eek:
Sony gave Blu-ray authoring hardware and software to the largest producer of porn at the time (Vivid Entertainment). It was a slam dunk from there. VHS had porn, Betamax didn't, VHS wins; Blu-ray had porn, HD-DVD didn't, Blu-ray won. Phones had small screens, porn went HD online, phones got bigger screens.
Never bet against booze or porn; you will lose almost every time.
 
Sony gave Blu-ray authoring hardware and software to the largest producer of porn at the time (Vivid Entertainment). It was a slam dunk from there. VHS had porn, Betamax didn't, VHS wins; Blu-ray had porn, HD-DVD didn't, Blu-ray won. Phones had small screens, porn went HD online, phones get bigger screens.
Never bet against booze or porn; you will lose almost every time.
Nah, it wasn't that simple with either VHS or Blu-ray. I'm not saying porn didn't help, but the big thing VHS had going for it was runtime. OG Beta tapes were pathetically small: the max length of the tapes was only 60 minutes, while VHS launched at 120 minutes at SP speed. It made VHS a lot more interesting to consumers, since tapes were quite pricey and you got literally double the time. It was also interesting to studios: while you can't fit all movies on a 2-hour tape, you can fit many, whereas you aren't fitting basically any on a 1-hour tape. Both increased tape sizes later, and also offered slower recording speeds to trade quality for length, but VHS was always on top, mostly because the tapes were physically larger and so could hold more tape. That, combined with the fact that VHS got licensed out pretty quick so lots of companies could make players at different price points, was what really did it. People wanted more recording time and cheaper units, and VHS had that.
 
Relevant to this topic, Jedi Survivor adds DLSS in the recent patch:

Jedi Survivor Patch 7 Change log

Patch 7 Details - September 5


Patch 7 for Star Wars Jedi: Survivor arrives today for PC, PlayStation 5, and Xbox Series X | S.
Here are the fixes you can expect with this patch:
  • This patch introduces several performance-related improvements* on PlayStation 5 and Xbox Series X/S including:
    • Performance mode has been completely reworked to substantially improve player experience.
      • A number of GPU and CPU optimizations – along with disabling Ray Tracing – have resulted in a better player experience, including a solid 60 FPS in Performance mode.
    • Quality Mode has also received optimizations to help reduce FPS fluctuation and introduce other visual improvements.
  • Variable Refresh Rate support added for PS5.
  • Additional performance & optimization improvements for PC, including DLSS support.
  • Save system tweaks to help prevent save game corruption.
  • Fixed issues where players could not retrieve their XP after dying under certain circumstances.
  • Various crash fixes.
  • Various bug fixes & improvements across all platforms, including fixes for cloth, lighting, and UI.
* Note: Cinematics in Star Wars Jedi: Survivor on console are locked to 30 frames per second.
Thank you all for the continued support you’ve given Star Wars Jedi: Survivor while we’ve been hard at work on patches. As always, let us know if you run into any additional issues.

Wonder what this might mean for Starfield, if anything.
 
Relevant to this topic, Jedi Survivor adds DLSS in the recent patch:

Jedi Survivor Patch 7 Change log



Wonder what this might mean for Starfield, if anything.
Meh. It was always going to be added at some point. The issue was that the Nvidia faithful wanted DLSS prioritized on AMD-sponsored titles. Something that even Nvidia never did.

Glad it's there though.
 
Meh. It was always going to be added at some point. The issue was that the Nvidia faithful wanted DLSS prioritized on AMD-sponsored titles. Something that even Nvidia never did.

Glad it's there though.
Not necessarily. Some studios are pretty lazy about that shit. There are a lot of games out there with only one and not the other, and plenty of other games that have a really old version of one in them. So there was a non-zero chance that Respawn just said "fuck it" and never did DLSS.

Same shit with Starfield. I would hope it gets added later, as there is no good reason for it not to be, but they may not. They may decide their time is better spent on other things, even though it wouldn't take a ton of time, or they may just not care. For that matter, MS may put their thumb on the scale. One thing that can be said for Starfield is that it makes the Xbox Series X look pretty good. DLSS, particularly with frame gen, would give Nvidia-equipped PCs a non-trivial advantage. Maybe they decide they don't want that; they'd rather this make people want to buy an Xbox.
 
The issue was that the Nvidia faithful wanted DLSS prioritized on AMD-sponsored titles.
Not sure what this means, but I doubt many, if any, of the Nvidia faithful wanted DLSS prioritized over FSR on an AMD-sponsored title. Why would they mind if both options are available, even if the only icon on the splash screen is FSR's and FSR is the first option available under upscalers? Like Diablo 4, which launched with FSR 2 but had all its marketing material be about DLSS.
 
Meh. It was always going to be added at some point. The issue was that the Nvidia faithful wanted DLSS prioritized on AMD-sponsored titles. Something that even Nvidia never did.

Glad it's there though.
Nice revisionist history, but it was never a known thing.
 
It surprised me HD-DVD didn't win. Simply for the reasons that everyone knew what DVD was, and for a few years by that point consumers had been beaten over the head with the phrase "High Definition". Seemed a no brainer, but Sony shenanigans won the day. :eek:

One thing people also forget is that HD-DVD had horrible problems with defective discs and bit rot. I worked in a video store when they started carrying both formats, and we had tons of defective returns with bad HD-DVD discs. Even good ones seemed to go bad in a short period of time.
 
Yes, and my whole response to your point was to be suspicious of it, by naming a long list of proprietary standards that have worked for a very long time. x86 has been going strong since 1978, and DVD was a giant success; we can also find lists of open standards that failed to catch on, and proprietary ones that failed.
There's no CPU standard that isn't proprietary other than RISC-V. Even still, x86 had a lot more manufacturers than just AMD and Intel; it's just that x86 has been distilled down to AMD and Intel. Everyone was able to make DVD players and movies, unlike DLSS, which is limited to Nvidia only.
They would just need to be API-agnostic, imo, for DLSS to not die: any time you turn motion-vector upscaling on, it should work regardless of whether it is DLSS/FSR/XeSS. Otherwise it will be rough, but DLSS does not need to work on other cards at all for that.
If AMD's FSR3 works as well as DLSS3, then DLSS will die. Nvidia knows this, which is why DLSS 3.5 is suddenly out before FSR3. DLSS 3.5 isn't even about upscaling; it's directed at ray tracing. It's pretty clear Nvidia doesn't want to lose the money it has put into marketing the DLSS name. RTX owners are foaming at the mouth over games not including DLSS, which is why modders now charge a fee: they know you will pay.

It's possible that developers didn't include DLSS because they felt that FSR works just as well. You can point out a zoomed-in section where FSR is worse than DLSS, but you won't really notice it in play. When this video was posted showing the "differences" of FSR vs DLSS in Starfield, the malaka didn't even know that the mod doesn't work without FSR enabled, but he was praising how much better DLSS was with FSR off. He couldn't tell that DLSS wasn't even working. It's getting to the point where it starts to look like a placebo. The DLSS mod even breaks some parts of the game, because again this is a hack job, but you all love it.
Would the competition accept for DLSS/FSR/Xess to work on a single agnostic API ? If they fear to never catch-up and have nothing to win with this, maybe not.
If the API was done for good reasons, sure. This is why I'd rather have Microsoft and the Khronos Group handle it, instead of AMD or Nvidia. It's likely that AMD's FSR might make it in as the standard. That wouldn't mean you couldn't include DLSS in games alongside Microsoft's or Vulkan's built-in upscaler, but I wouldn't expect to see a lot of games with DLSS if that option exists.
Nice revisionist history, but it was never a known thing.
You guys were ready to crucify anybody who disagreed with the notion that AMD paid developers not to include DLSS. The hypocrisy here is beautiful. We had a whole thread closed over this, but now the game finally has DLSS. kac77 is right: you guys were upset that DLSS wasn't a priority in an AMD-sponsored game. Why do you think there's a paid mod for a crappy implementation of DLSS? The mod creator knew you guys were impatient and exploited this. How does it feel to pay for a mod that is now useless and inferior? I bet Starfield will eventually get DLSS too. Probably won't be anytime soon. This also calls into question the credibility of John Linneman's (Digital Foundry) statement.
 
Everyone was able to make DVD players and movies
Everyone that was granted a license. Maybe you did not mean proprietary from the beginning, but always meant closed by that word?
http://www.dvd6cla.com/royaltyrate.html

kac77 is right: you guys were upset that DLSS wasn't a priority in an AMD-sponsored game.
That's quite speculative, that it needed to be a priority: if both are available at launch, why would DLSS need to be prioritized at all?

the malaka didn't even know that the mod doesn't work without FSR enabled, but he was praising how much better DLSS was with FSR off.
And it was strange, because the shadow on the right of the screen looked significantly worse, imo... But someone making a video does not necessarily believe what he is saying, if he feels a video about a mod is more interesting when the mod is worth it.

You can point out a zoomed-in section where FSR is worse than DLSS, but you won't really notice it in play.
Without zooming, at 27s here, you do notice it, right? (Or maybe it's just me; I am someone who does not care much about high FPS, for example, so I guess everyone notices different things, but here there seems to be a lot more shimmering without DLSS on.)

View: https://youtu.be/ZtJLCAWSzR8?t=27



This also brings into question the credibility of John Linneman's (Digital Foundry) statement.
How so? Do you think any deal or quid pro quo of the sort would be about the lifetime of the game, rather than a certain window of time (like while the game is bundled with AMD hardware sales)?
 
It doesn't make sense as part of DX. A temporal upscaler is at odds with every graphics API on the planet because it's an intrinsically higher level thing than any API ever deals with.

The core APIs don't know what shadows are, they don't know what models are. They don't even have a concept of a camera or any of the typical math primitives like matrices and vectors, and they certainly have zero idea of the multiple things you need to feed into such a system for it to even work. These are all constructs built and handled by a developer.

Same with ray tracing. It's a super basic, foundational thing. They give you the tool and it's on you to do something with it. You get a way to build the acceleration structure, you get a way to fire rays, and that's basically it. It doesn't know what a reflection is, or a shadow. It lets you fire a ray and intersect a triangle and that's about it.

The APIs are an interface to the hardware first and foremost, and the driver provides the implementation. This isn't a hardware function. Which is why it's a software library.

Now you're introducing concepts like a camera, timers, motion vectors, and other shit into the API that was previously oblivious to it all. Which again, is why at best it'd wind up as a Microsoft sanctioned example/library.
Ray tracing is part of DX. Nvidia does not have some way of doing ray tracing outside of DirectX or Vulkan. Nvidia follows a standard... even when they launched it, they did so with Microsoft. I know people find it hard to believe, but ray tracing was in the pipe prior to Nvidia. They were first with hardware, but they didn't actually get there on their own.
https://developer.nvidia.com/rtx/raytracing/dxr/dx12-raytracing-tutorial-part-1
https://learn.microsoft.com/en-us/a...june/directx-factor-the-canvas-and-the-camera
https://learn.microsoft.com/en-us/windows/uwp/gaming/implementing-depth-buffers-for-shadow-mapping
https://microsoft.github.io/DirectX-Specs/d3d/Raytracing.html

Yes, DX shadow and camera control are basic functions of DX. (DX calls refer to EYE instead of camera, but same thing.) Ray tracing is implemented via DX. Adding something like FSR would really not be any more of an issue than adding the RT functions was (technically I believe it would actually be less work). It would need to be added in a way that is compatible with the way other things are handled, but there is no technical reason it can't be added. Same goes for Vulkan.
 