24" Widescreen CRT (FW900) From Ebay arrived,Comments.

Sad to hear you're completely getting out, JBL, but I'm glad you gave us a lot of good posts and insights over the years.

I gotta say though, if I had to get rid of all my tubes and only keep one, it would definitely be a 15kHz set, whether a PVM or even a plain old TV. I would miss my PC monitors, but life would go on. But it would be tough to not be able to play Mario Bros 3 or Super Metroid on a proper 240p CRT.
 
My wife hates it but I just can't let it go, 240p on the CRT can't be beat.

1716292486189.png
 
lol, just a quick update: about 3 weeks ago there was a blackout in my hood and it killed my motherboard (ASRock Z590 Steel Legend). Up until that point I was using a 980 Ti with a passive DVI-I to VGA adapter and usually ran my desktop at 1920x1200 interlaced 144Hz on my Samsung SyncMaster 997MB. After it broke I sold the GPU and CPU (i3-10100F) and had to use my living room computer for a couple of weeks. The living room PC is an Athlon 3000G with an iGPU, and when I hooked it up to my 997MB CRT the drivers were total garbage. The motherboard had a VGA port, so I connected the CRT there, but I couldn't get any resolution besides 1920x1080p at 60Hz to work, they all black-screened (I think 1440x900 at 60Hz also worked, actually). Either way... it was shit.

Well, yesterday I finally went and picked up my replacement PC for the time being: a Gigabyte Z390 UD + i3-9100, limited to the iGPU for now. This motherboard ONLY has an HDMI port and I didn't have any HDMI adapters left, so I literally went out and bought a super cheap generic $7 HDMI to VGA adapter in the streets, just 12 blocks down from my home, and I LUCKED OUT lmaooo. This thing can do 1920x1200i 90Hz (haven't tested further) with perfect blacks and a picture just as sharp as the DVI-I to VGA adapter. This is so ridiculous hahahahah, I thought I was gonna be limited to 1080i 60Hz or some shit. I am soooo happyyyy. Interlaced resolutions work EVEN BETTER on the Intel iGPU than on the 980 Ti with native VGA on the 368.81 drivers, rofl.
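For a bit of perspective on what that cheap dongle is actually being asked to do, here is a rough pixel clock estimate for 1920x1200i at a 90Hz field rate. The blanking fractions are just assumptions I picked for the estimate, not values from a real modeline; the point is that interlacing keeps the clock modest, which is probably why even a generic converter chip can lock onto it.

Code:
# Rough pixel clock estimate for 1920x1200 interlaced at a 90Hz field rate.
# Assumed blanking (~30% horizontal, ~5% vertical) is a guess, not a real modeline.
$hActive = 1920; $vActive = 1200; $fieldRate = 90
$hTotal = [math]::Round($hActive * 1.30)
$vTotal = [math]::Round($vActive * 1.05)
# Interlaced: each field carries half the lines, so full frames arrive at $fieldRate / 2.
$pixelClockMHz = $hTotal * $vTotal * ($fieldRate / 2) / 1e6
"{0:N1} MHz" -f $pixelClockMHz   # around 141 MHz with these assumptions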
 
Nothing in the conversation suggests they'll come back with a solution for users to fix a series they know has been faulty from the start, when a simple firmware update shouldn't be that difficult. I think my contact just dropped the line. A real shame of a company.
Just a little update to set things right. I actually got another message when I wasn't expecting any more. I'm told the firmware can't be updated on those adapters. Whether that's true or not, the situation doesn't change: either you contact them within the warranty period and they may replace the faulty adapters, or they won't stand behind their product.
 
VGA ports on motherboards are mostly pure trash. Get yourself a Quadro M6000 24GB, the fastest card with the most VRAM that still has a DAC.
 
Enhanced Interrogator https://habr.com/ru/companies/ruvds/articles/524170/

Very insightful article. I learnt we're capped at 3800x2500 without a software workaround.
Lol, that photo in the article is my FW900 on my desk I built in 2015! I keep seeing this photo on every article online wtf :ROFLMAO:
That's the thumbnail I used for that old video:

View: https://youtu.be/vIIWTMOvisw

fw900_youtube_vid_vIIWTMOvisw.png


The 380X has the same 400MHz DAC as all of the R200-R300 series. There hasn't been a single GPU since the ATi 9700 era with a 500MHz+ DAC, which is why everyone here has always recommended the Sunix DPU3000-D4 and the Delock/ICY BOX rebranded adapters. Only those 3 adapters come with a HQ RAMDAC over 535MHz by default.
Xar, note that the AMD Radeon R9 380X, the R5 250, plus a few others I haven't tested, all have a 655.35MHz RAMDAC after applying the ToastyX Pixel Clock Patcher. And it works absolutely flawlessly!
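As a small aside, that oddly specific 655.35 figure is exactly 65535 × 10kHz, so my guess (pure speculation, nothing confirmed) is that the unlocked limit is simply a 16-bit clock field counted in 10kHz units rather than a deliberately chosen analog ceiling:

Code:
# 655.35MHz = 65535 (0xFFFF) x 10kHz. Speculation only: it looks like a
# 16-bit register counted in 10kHz units, not a hand-picked analog limit.
65535 * 0.01   # = 655.35 (MHz, if each unit is 10kHz)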

I was able to do 3840x2160 at 60Hz progressive once, directly out of the R9 380X on my F520 after carefully adjusting the timings. It was not perfect, I could not even center the picture quite right, but I was very impressed at the time.
I had no idea back then that Intel UHD 7xx iGPUs and Intel DG1 dGPU could reach even higher resolutions when going interlaced over DisplayPort, so that below was the highest resolution I had ever reached on a CRT at the time!

PXL_20220830_164801077.jpg


Now for interlaced on the Radeons, the drivers have very very dumb limitations, after a lot of testing, I was able to get this...

5fw4f63b7pzc1.png


So basically the RAMDAC is outputting 4800x3000 interlaced at 60Hz here, and it's possible to go a bit higher as well, with something like 5000x3125. I was able to confirm the horizontal and vertical frequencies on the monitor's OSD, and everything looked perfectly stable.
The issue is that because of the weird AMD drivers, the Windows desktop resolution can't go that high... So it uses the previous available resolution below as the desktop resolution, basically upscaling the 2560x1600 desktop up to 4800x3000 to be displayed on the CRT, which of course turns out blurry. How dumb!
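Just as a sanity check on those numbers, with some assumed GTF-ish blanking (roughly 30% horizontal and 5% vertical, my guess rather than the actual modelines), both modes do fit under the 655.35MHz unlocked limit:

Code:
# Rough interlaced pixel clock estimates, assumed blanking only (~30% H, ~5% V).
function Get-InterlacedClockMHz($h, $v, $fieldRate) {
    $hTotal = [math]::Round($h * 1.30)
    $vTotal = [math]::Round($v * 1.05)
    return $hTotal * $vTotal * ($fieldRate / 2) / 1e6
}
"{0:N1} MHz" -f (Get-InterlacedClockMHz 4800 3000 60)   # ~590 MHz, under 655.35
"{0:N1} MHz" -f (Get-InterlacedClockMHz 5000 3125 60)   # ~640 MHz, getting close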

So now for the limits on interlaced resolutions. Basically, you can't go above 2728 pixels horizontal. Past that, the resolutions stop showing up in Windows settings...

After a lot of testing taking advantage of the weird resolution scaling behavior above, I noticed that past 2728 pixels horizontal, the higher the horizontal resolution, the higher the pixel clock limit. For example:
- At 3840x2400i the pixel clock limit was 597.01MHz
- Going up to 4080x2550i the limit moved up to 634.32MHz
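For what it's worth, those two data points suggest the ceiling scales almost exactly linearly with the horizontal resolution, at roughly 0.1555MHz per horizontal pixel, which smells more like a line-rate style check than a flat clock cap. That's pure extrapolation from the two numbers above, nothing more:

Code:
# Playing with the two data points above, nothing measured beyond them.
597.01 / 3840    # ~0.15547 MHz per horizontal pixel
634.32 / 4080    # ~0.15547 MHz per horizontal pixel
0.15547 * 4800   # ~746 MHz extrapolated ceiling at 4800 wide (untested guess)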

I tried changing only the vertical resolution, and the pixel clock limit remained exactly the same. Only changing the horizontal resolution affected the maximum pixel clock limit. There is some weird math going on in the AMD driver file ATIKMDAG.SYS.
If ToastyX or someone else could figure out a way to patch the driver and remove those weird horizontal pixel limits, and maybe also the variable pixel clock limit on interlaced resolutions, that would be absolutely incredible! Not sure that can be done, considering it does not look like a single value to change, but some weird calculations.
Maybe I will drop ToastyX an email to explain to him what I found...

So overall those Radeons definitely have the fastest native VGA output I've ever seen so far, the second best being the Sunix DPU3000, but over DisplayPort. The Radeon wins on progressive with native VGA output, while the Intel UHD 7xx and Intel DG1 win for interlaced over DP with the DPU3000.
Both the Radeon and the DG1 seem to behave very similarly when used in passthrough with a more powerful GPU taking care of the rendering. The Intel UHD 770 iGPU seems to work quite a bit better, but I need to do some more testing and compare more carefully.

I didn't get any new display. I've been downsizing and decluttering. My kids don't care about this kind of stuff. I care more about spending time with them. By the time they're out of the house these monitors will be at least 30 years old. No sense in hoarding them to myself.
Oh man, really sad to see you leave the hobby! Family is for sure the most important thing in life!

That said, sad to see you're not keeping any CRT at all, it's a pretty cool hobby especially since you've been around for such a long time!

I was talking to spacediver recently after seeing him sell his defective FW900. He told me he had another one, and I was relieved haha.
I was telling him how I remember all you guys here welcoming me and helping me through calibrating my monitor and everything back in the day. I was super happy to join this legendary thread, and I'm still impressed all the time seeing how active it remains to this day!

On my side, I feel lucky that both a friend and I had the opportunity to get our hands on a bunch of monitors a few years ago. I will have to go through the recapping process for some if not all of them over the years to make sure they last as long as possible, especially the FW900, since I was not able to find a second one as a backup.
I can't imagine myself selling any of that stuff, I would rather open a museum or something haha. I'm planning on making a gaming room one day, with a wall of retro PC hardware and the CRTs lined up on a large table below. But I don't have any room available for that at this time, so I will figure something out later.

I'm currently thinking of getting an AMD-Xilinx Artix UltraScale+ FPGA dev board with a DisplayPort 1.4 sink & source module. The goal is for me to learn how to work with FPGAs, as it would be my first FPGA project, and to try to create a progressive-to-interlaced converter with as low latency as possible.
That way, on the PC I can just output anything progressive directly out of any recent GPU with the latest drivers: no compatibility issues, no fighting with old drivers for GPU passthrough, no extra complexity, and hopefully ultra-low added latency.
Resolutions within the CRT limits could remain progressive and be passed through without any processing, and anything above those limits would be automatically converted to interlaced using either GTF timings or custom modelines. Something like that.
I've been looking at every single way I could find to get interlaced with recent hardware, and it's always a pretty meh compromise. So going that route seems to be the only remaining solution.
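Something like the following, sketched in PowerShell just to show the decision logic and the math, not actual HDL; the CRT limits and blanking fractions here are placeholder values I picked for the example:

Code:
# Placeholder CRT limits for the example (not measured values).
$crt = @{ MaxHFreqKHz = 121; MaxDotClockMHz = 390 }

function Convert-ToCrtMode($hActive, $vActive, $refreshHz) {
    # Assumed GTF-ish blanking; the real thing would read the incoming DisplayPort timings.
    $hTotal = [math]::Round($hActive * 1.30)
    $vTotal = [math]::Round($vActive * 1.05)

    # Scan rates of the incoming progressive mode.
    $hFreqKHz    = $vTotal * $refreshHz / 1000
    $dotClockMHz = $hTotal * $vTotal * $refreshHz / 1e6

    if ($hFreqKHz -le $crt.MaxHFreqKHz -and $dotClockMHz -le $crt.MaxDotClockMHz) {
        "pass through progressive: {0}x{1} @ {2}Hz ({3:N1} kHz, {4:N1} MHz)" -f $hActive, $vActive, $refreshHz, $hFreqKHz, $dotClockMHz
    } else {
        # Interlacing sends half the lines per field, so the CRT sees half the
        # line rate and half the dot clock while the field rate stays the same.
        "convert to interlaced: {0}x{1}i @ {2}Hz fields ({3:N1} kHz, {4:N1} MHz)" -f $hActive, $vActive, $refreshHz, ($hFreqKHz / 2), ($dotClockMHz / 2)
    }
}

Convert-ToCrtMode 1920 1200 85    # fits the limits -> left progressive
Convert-ToCrtMode 2560 1600 120   # line rate too high -> interlaced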

progressive_to_interlaced_schematic.png
 
By the way, I mentioned the Intel DG1 GPU earlier... I don't think I talked about that here before. Well... imagine my surprise when I learned Intel created a dedicated GPU before ARC existed. ARC is codenamed DG2, and DG1 was its predecessor. It uses PCIe 4.0 x8.
It's a very, very quirky Iris Xe dedicated GPU that absolutely requires "Resizable BAR" to be enabled in the BIOS, or it will not even POST. Originally, the card was designed to work only on two specific Asus motherboards and alongside an Intel CPU, but with that option enabled in the BIOS it will work on any motherboard, and on a Ryzen system too.

That said, there will be no output whatsoever during boot; video output from the DG1 only becomes available after the Windows drivers are loaded. While that's a bit annoying, it's acceptable, since the goal is to use it for passthrough alongside another GPU.

Also, when switching resolutions on a CRT, it will very often (if not always) disconnect the monitor... Unplugging and replugging is not the solution; you need to go to the Windows display settings and chain-click "Extend these displays" (sometimes 20 times...) until it finally works! Don't ask me how I even found out about that... let's just say I was incredibly annoyed.
I've experienced this a LOT with interlaced resolutions. I don't remember if it's the same on progressive, as the only reason to use GPU passthrough is interlaced anyway; I can connect the VGA adapter directly to the Nvidia card for progressive.

So, this bug is not the end of the world though. I made the following little PowerShell script to automate the "Extend these displays" procedure. It retries as many times as needed until it works, then it automatically stops.
I turned it into an exe with the PS2EXE tool, and assigned it a keyboard shortcut Ctrl+Alt+F12. So anytime the CRT shuts off when switching resolution, I use the shortcut and wait a couple seconds for it to get back on.

Code:
# Keep forcing "Extend these displays" until the Intel GPU reports a valid refresh rate, i.e. until the CRT comes back.
while ((Get-WmiObject -Class "win32_videocontroller" -Namespace "root\CIMV2" | Where-Object { ($_.VideoProcessor -eq "Intel(R) UHD Graphics Family") -or ($_.VideoProcessor -eq "Intel(R) Iris(R) Xe Graphics Family") } | Select-Object -ExpandProperty MaxRefreshRate) -in ($null, 0)) {
    # DisplaySwitch.exe is the same tool behind Win+P; /extend re-applies extended mode.
    DisplaySwitch /extend
}

I believe this only works / works best when using the CRT alongside a secondary monitor plugged in to the main GPU. At least that's how I tried it myself. Now if you launch a game that changes resolution on startup, well... That's another problem.
I tried Counter-Strike Source, and fortunately the game has launch parameters to set the resolution, so problem avoided here.

This monitor disconnection issue seems to have been fixed on drivers 31.0.101.4575 and above, not too sure, but unfortunately support for interlaced resolutions was already long gone at this point. So no good here.

Now, why even bother with this weird GPU that doesn't seem interesting at all? Well, while ARC is very, very limited with interlaced resolutions, with a pixel clock limit of around 220MHz (221.71MHz at 1920x1440i, 223.63MHz at 1600x1200i, so not exactly a fixed pixel clock limit), the DG1 on the other hand has the same 503.23MHz bandwidth as the Intel UHD 770 iGPU when running interlaced over DisplayPort! That's getting very close to the 540MHz limit of the Sunix DPU3000. This is only doable with earlier driver versions; I am using version 31.0.101.3975, but version 31.0.101.4032 also works. Support for interlaced was dropped mid-January 2023 for the DG1, and I seem to remember the iGPUs got one or two more driver revisions before also seeing interlaced resolutions go away.

So this seems to be a very interesting way to do interlaced with GPU passthrough alongside a recent AMD or Nvidia GPU on a Ryzen system (without an Intel UHD 7xx iGPU). Also, future generations of Intel CPUs most certainly won't support interlaced output, since support is already gone on the latest drivers with the current 12/13/14th gen, so the DG1 is an interesting way to keep upgrading the CPU to a more modern one in the near future.

With all that said, I finally received my Intel Core i5 13600K last week! So for the time being I will be using it for passthrough with interlaced resolutions. The DG1 is such a hassle, it's still nice to have it for some future use, but for now the current gen Intel is a much better choice.

Now there is the latency issue with running passthrough... a topic for another time. But yeah, it's very low, but it's there. It seems lower on the iGPU compared to both the DG1 and the old Radeons. Not too sure about that.
But either way it's enough to be just noticeable in some games, which is why I'm now thinking about the custom FPGA hardware solution I mentioned above. Not sure I can get that done, but I would be very interested in trying.
 
Lol, that photo in the article is my FW900 on my desk I built in 2015! I keep seeing this photo on every article online wtf :ROFLMAO:
That's the thumbnail I used for that old video:

View: https://youtu.be/vIIWTMOvisw

View attachment 655183

Didn't know the Russian who wrote the article was a copycat 🤣
Xar, note that the AMD Radeon R9 380X, the R5 250, plus a few others I haven't tested, all have a 655.35MHz RAMDAC after applying the ToastyX Pixel Clock Patcher. And it works absolutely flawlessly!
Yup. I already noticed nearly a decade back that you can 🔓 those DACs with ToastyX's patcher. It's just illogical and nonsensical for AMD and NV to cap theirs at the 400MHz standard even though they knew their full capacity. The initial rumour even had Kepler at 500MHz before NV clamped it down to 400 like its predecessors since 2006.
I was able to do 3840x2160 at 60Hz progressive once, directly out of the R9 380X on my F520 after carefully adjusting the timings. It was not perfect, I could not even center the picture quite right, but I was very impressed at the time.
I had no idea back then that Intel UHD 7xx iGPUs and Intel DG1 dGPU could reach even higher resolutions when going interlaced over DisplayPort, so that below was the highest resolution I had ever reached on a CRT at the time!

View attachment 655184
This model costs over 4000€ in my region. It's almost as expensive as an FW900 and has an even better tube than the FW900. I gave up on buying it since the CRTs I bought never cost more than 350€ maximum.

I never expected Intel cards to scale that well with CRTs given how unsupportive they are.
Now for interlaced on the Radeons, the drivers have very very dumb limitations, after a lot of testing, I was able to get this...

View attachment 655185
So basically the RAMDAC is outputting 4800x3000 interlaced at 60Hz here, and it's possible to go a bit higher as well, with something like 5000x3125. I was able to confirm the horizontal and vertical frequencies on the monitor's OSD, and everything looked perfectly stable. The issue is that because of the weird AMD drivers, the Windows desktop resolution can't go that high... So it uses the previous available resolution below as the desktop resolution, basically upscaling the 2560x1600 desktop up to 4800x3000 to be displayed on the CRT, which of course turns out blurry. How dumb!
So now for the limits on interlaced resolutions. Basically, you can't go above 2728 pixels horizontal. Past that, the resolutions stop showing up in Windows settings...

After a lot of testing taking advantage of the weird resolution scaling behavior above, I noticed that past 2728 pixels horizontal, the higher the horizontal resolution, the higher the pixel clock limit. For example:
- At 3840x2400i the pixel clock limit was 597.01MHz
- Going up to 4080x2550i the limit moved up to 634.32MHz

I tried changing only the vertical resolution, and the pixel clock limit remained exactly the same. Only changing the horizontal resolution affected the maximum pixel clock limit. There is some weird math going on in the AMD driver file ATIKMDAG.SYS.
If ToastyX or someone else could figure out a way to patch the driver and remove those weird horizontal pixel limits, and maybe also the variable pixel clock limit on interlaced resolutions, that would be absolutely incredible! Not sure that can be done, considering it does not look like a single value to change, but some weird calculations.
Maybe I will drop ToastyX an email to explain to him what I found...

So overall those Radeons definitely have the fastest native VGA output I've ever seen so far, the second best being the Sunix DPU3000, but over DisplayPort. The Radeon wins on progressive with native VGA output, while the Intel UHD 7xx and Intel DG1 win for interlaced over DP with the DPU3000.
Both the Radeon and the DG1 seem to behave very similarly when used in passthrough with a more powerful GPU taking care of the rendering. The Intel UHD 770 iGPU seems to work quite a bit better, but I need to do some more testing and compare more carefully.
Amazing finding dude 👍🏼
I never played around with 60Hz or anything lower before. The guys always told me I'd be defeating the purpose of getting a CRT in 2023, so I went with the usual extreme refresh rate tweaking. I capped out at 1920x1440 at 170Hz max on my IBM P275 with a 3090 Ti using the Sunix DPU3000-D4. I will try the lower-refresh, higher-resolution trick like you do. 😍

Have you ever played around with Maxwell (745 through Titan X and M6000 24GB) or Pascal (1030) cards, btw?

Over 8 communities since 2003, I've heard thousands of times that ATi/AMD has evidently superior DAC and overall output quality, even in TechYesCity's reply.
But those limitations you encountered might be exclusive to Radeon (whether they come from ToastyX's magic or good ole Radeon pipeline calculation bugs). I wanna know if I could use ToastyX to push GeForce/Quadro GPUs beyond their DAC MHz limits with the P275 and DPU3000-D4.
 
VGA ports on motherboards are mostly pure trash. Get yourself a Quadro M6000 24GB, the fastest card with the most VRAM that still has a DAC.
Dude, that's only like 10% more powerful than a 980 Ti. Admittedly it is quite cheap on eBay, I think I saw it for like $200 on average, but just like the 980 Ti, interlaced on modern drivers will be limited to the HDMI port. Modern Nvidia drivers absolutely kill the analog ports; check it out if you don't believe me, you can't even make 1280x800 at 75Hz render, and on my 980 Ti it turned the entire screen into a yellow hue mess.

At that point I think I'd just go down the passthrough route: pay $100 more and buy whatever best used RTX GPU I can get, which will probably destroy all Maxwell GPUs in performance even if it has less VRAM. The 2060 non-Super is already 25% faster in gaming than the Quadro M6000 and the Titan X Maxwell.

I will be sticking with this setup for at least 5 more months though, I'm not working currently and uni is killing me.
 
I might be wrong about that, but do the Nvidia Pro GPUs use a completely different driver compared to the gaming GPUs? Meaning you could run both an RTX with the latest drivers and the Quadro with the correct drivers for interlaced.

Someone mentioned that to me; I completely missed it, as I never used any pro Nvidia GPUs before. I need to get my hands on a Quadro to test at some point.

That said, I don't think there is anything better than passthrough for good performance these days. The really annoying part is it adds a little bit of input lag, which is very undesirable.
 
I might be wrong about that, but do the Nvidia Pro GPUs use a completely different driver compared to the gaming GPUs?
Differences are:

-GRD and SD for GeForce RTX and GTX (745-Titan Xp-Titan V CEO Ed.)

-Production Branch/RTX Studio and New Feature Branch/RTX Enterprise for Quadro RTX/RTX xxxx

-DCH package and Standard package

-Windows vers

-Languages
 
Meaning you could run both an RTX with the latest drivers and the Quadro with the correct drivers for interlaced.

Someone mentioned that to me; I completely missed it, as I never used any pro Nvidia GPUs before. I need to get my hands on a Quadro to test at some point.
There might be slight differences in how far you can push the limits of each driver between the GeForce and Quadro DACs, I reckon. ToastyX and Chief Blur Buster might know about this.

In terms of the components used within each GPU they produce, Quadro is the absolute finest. I imagine the same applies to its DAC (it could be even better than Radeon's).
That said, I don't think there is anything better than passthrough for good performance these days. The really annoying part is it adds a little bit of input lag, which is very undesirable.
Yeah. The most convenient method is to pass the analog signal through an external adapter/converter. Even though there is input lag and latency, the FPS compensates for it.
 
Pascal GPUs, including the 1080 Ti and Titan Xp, can do interlaced on the newest modern drivers too, but ONLY on the HDMI port and only limited to 400MHz bandwidth I think, even if your adapter can go further.
 
On the latest drivers?! I remember we used to be stuck on 411.70 as the latest driver that could do interlaced... Did they add it back later? I just read on the Nvidia forum some guy saying 537.58 is the latest driver supporting interlaced as of early 2024?
Damn, if Nvidia plays with us, going back and forth between drivers supporting and not supporting interlaced, it's very annoying and quite an unreliable option.

I'm very curious though, I have a GTX 1080, I will test that.

That said, it's kinda pointless for me, as it should perform just like passthrough with an Intel iGPU, which can do a whole lot better with 503MHz over DisplayPort and better adapters.
And also, if Nvidia requires some trickery to get interlaced resolutions to display, as I remember it used to (is it still like that?), Intel iGPUs just work. You set the resolution in CRU, and once that's done, you just switch resolution in the Windows display options, that's all.

I'm gonna test and see for myself.

To me, the only good use case for this would be if you use the LK7112 adapter, as Intel would be more limited over HDMI than Nvidia here, if Nvidia really goes up to 400MHz interlaced over HDMI.
Or if you intend on using a 1080 Ti standalone, for rendering as well as display. But since interlaced allows for really high resolutions and refresh rates, like 2560x1600 at 140Hz or something like that, the 1080 Ti won't really keep up with the latest and upcoming titles.
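Just to put a number on that 2560x1600 at 140Hz example (a rough estimate with assumed ~30%/5% blanking, not an actual modeline): it lands right around the 400MHz mark, and well within 503/540MHz:

Code:
# 2560x1600 interlaced at a 140Hz field rate, assumed blanking only.
$clockMHz = [math]::Round(2560 * 1.30) * [math]::Round(1600 * 1.05) * (140 / 2) / 1e6
"{0:N1} MHz" -f $clockMHz   # ~391 MHz: tight against a ~400MHz limit, easy for 503/540MHz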
 
I was always skeptical about whether Pascal was initially going to feature more cards with VGA/DL-DVI-I. How they ended up with only the 1030 variants featuring an integrated DAC is just unusual.

It would make more sense to just retire signal converting with the whole of Pascal, or even better, extend it to the Turing GTXs.
 
Yeah, I'm Argentinian, and another Argentinian user from Reddit, "druidvorse", bought the LK7112 adapter from that Turkish guy on eBay to use with his GTX 1060. That is his only GPU, and his only monitor is the same as mine (Samsung SyncMaster 997MB). I asked him 8 days ago in a private message and he confirmed he can still run 1920x1200i 144Hz out of his 1060 on the HDMI port with the 7112 adapter on the latest Nvidia drivers. The 400MHz I pulled out of my ass, it's just my guess because that is traditionally what they limited us to on their own RAMDACs, but I think it's the 7112 limit as well anyways. I think the best use case for this would be a 1080 Ti with the LK7112 to run something crazy like 1680x1050i 165Hz or 1920x1200i 165Hz, something like that. The 1080 Ti can still pull that type of performance in some decent games from before 2022, but most importantly, you don't get the added input lag of passthrough.

Btw, since I've never done passthrough, I wanna know: is the input lag difference actually noticeable? Do you genuinely feel it when you go from native to passed through??? Be honest wit meee :DDD
 
That is his only GPU, and his only monitor is the same as mine (Samsung SyncMaster 997MB). I asked him 8 days ago in a private message and he confirmed he can still run 1920x1200i 144Hz out of his 1060 on the HDMI port with the 7112 adapter on the latest Nvidia drivers
Well, that's really interesting stuff then! Tomorrow I will fire up a PC I was gonna put up for sale (GTX 1080 & i7 6700K) and test that before listing it. I plan on getting an EVGA 1080 Ti Kingpin if I ever find one for a good price one day; it would be really nice to know it's the last GPU that can run interlaced properly, natively, with the LK7112 adapter, all the way up to the adapter's limit, and on recent drivers too!

The 400MHz I pulled out of my ass, it's just my guess because that is traditionally what they limited us to on their own RAMDACs, but I think it's the 7112 limit as well anyways
I don't remember properly testing the limits of this adapter on interlaced, so I'll finally be able to do that. I will report here when I'm done testing.

Btw, since I've never done passthrough, I wanna know: is the input lag difference actually noticeable? Do you genuinely feel it when you go from native to passed through??? Be honest wit meee :DDD
So, I didn't play a whole lot, but I can tell you how it felt for me.

When I was testing interlaced passthrough for the first time on the Intel UHD iGPU (i5 13600K & RTX 4060), I started by playing a few games of Call of Duty MW3 Multiplayer, and I didn't really notice the added input lag at first. Everything felt really smooth, like it should at 140Hz.
Then I went to play some WoW Classic, and here after just a few minutes of gameplay flying over the zones I really started to feel like the movements were not quite right. It surprised me a bit, because that's something I definitely would have felt playing a FPS, and now I was noticing this in WoW o_O
So I switched back to direct output from the RTX (progressive, same resolution, lower refresh rate), and yeah the difference here felt really obvious immediately. Went back to passthrough, and for sure, there is noticeable input lag... unfortunately.

So overall, the added input lag is quite minimal I would say, it did not prevent me from winning in Call of Duty :ROFLMAO:, but in some more competitive games I guess it can become a real problem. I'm more of a casual player, but still it annoyed me for sure.

With a good adapter, on direct output, you can get close to maxing out the CRT on progressive. So going through all that trouble pretty much just for interlaced, getting higher refresh rates at potentially higher resolutions but at the expense of slight but just noticeable added input lag... I'm not sure here. Not quite convinced by that setup for now.

That said, I will test some more. The difference here between WoW Classic and MW3 is, I'm running a 4060 and MW3 was running around 130-140fps overall, not really over that. On the other hand, WoW had the potential to go a whole lot higher in fps.
I'm trying to remember if I did cap the fps or not, which could make a noticeable difference in input lag, as I noticed when testing the PCIe GPUs. The iGPU was much more forgiving with that, but still, I need to retest tomorrow and enable VSYNC or just enable the fps limiter in game.
Once again that would be sort of a compromise, but if it lowers the input lag it would be great!

All that is why I'm now thinking of going all out and trying to design my own DisplayPort to DisplayPort adapter (DP 1.4) that takes progressive on its input and turns it into interlaced on its output, followed by the 540MHz Sunix DPU3000 for VGA out.
I've always been curious about FPGAs, how to work with them and all. So it's gonna be the occasion for me to have a closer look and see if I can pull this off or not. Very curious about this...
Hopefully that should be as lag free as it gets, and all Windows will see here is a progressive resolution directly from a single GPU, so no weird compatibility issues or anything.
 
Hey, could you do me a solid if you try the 1080 Ti interlacing on modern drivers? Could you please try Cyberpunk 2077 at 1080i??? And snap a couple of photos at least (if you can shoot a vid, even better!). If you can't, it's all good lol
 
I'm honestly still baffled by how well Intel UHD handles interlaced, this is insane. It's wayyyy better than the Nvidia drivers or anything on Linux: I can turn off the computer and it's all fine, I can switch between different resolutions, I don't have to worry if a game changes the resolution, and the transition between progressive and interlaced is super smooth.

I wish there was a way to talk to Intel about this, there is gold there.
 