With the FW900 as my main display at home and a pair of IPS panels at work, I noticed the former's richer, if smaller, picture when I got home, but I didn't perceive any delta in physical effects...
2003 for mine. It was the last of them, or very nearly so... Sadly.
Ya, the clarity was a big selling point to me when I decided to switch. The size was another one: I went from a 22" CRT, which is like 20" in terms of usable screen, to a 24" widescreen, which was real nice.

Also related to the clarity was not having to adjust the geometry. One of my least favorite CRT activities was spending an hour trying to get the image to occupy as much of the screen as possible while still being geometrically correct and in focus. Of course it couldn't hold that, so every so often you got to do it all again.
I think they're great and their motion clarity has not yet been surpassed, except for a select number of VR headsets.

To each their own. I honestly don't understand the love for these things.
I hated LCD's when they first showed up in PC's in the 90's, but back then they were really bad. Horrid input lag, ghosting, terrible contrast, you name it.
By 2006 or so, I switched to LCD's and never looked back.
I've found LCD's to be sharper and clearer than CRT's for 20 years now, and CRT's lack key modern features like VRR/G-Sync/FreeSync, without which I wouldn't want to run any game these days. As someone who loves ever bigger and better displays and always has, I honestly don't miss CRT's at all.
The GDM-FW900's were amazing for a brief few years from ~2000 (or whenever the first ones were made) to ~2003, but at any time after that I think I would have preferred the top end LCD's of the day.
Anyway, here's the part I find really surprising: this "hot, red, flushed face and ears from TV/monitor use" was such a huge part of everyday existence in the '80s and '90s when I started in the hobby, yet googling it today turns up almost no reference to it. That is absolutely amazing to me.
I'll refer you to the Digital Foundry video on the FW900. There's a reason one of the last new ones in existence went for the price of a new car.
It's arguably the finest display ever made, and it still holds up: clear, sharp text, a modern aspect ratio, big dynamic range, great blacks, amazingly clear motion, etc.
Sure, in the office I'll take a couple or more LCDs, because there it's all about the real estate, but for my enthusiast machine I wanted something that performs. And for years in this display desert of a century there was nothing...
Times are better now. I'm typing this on a CX.

I haven't used a CRT in person in almost 20 years, so I haven't exactly seen them side by side, but I can't imagine picking any CRT ever made over any decent LCD from the G-Sync/FreeSync/VRR era.
Touché.

I'll actually disagree on cheap LCDs being bad. If they were all that was available, sure, but I'm glad they exist, because they bring decent display quality to people who don't have tons of money. Cheap LCDs are way better than cheap CRTs were, and cheaper to boot. Cheap CRTs SUCKED: small, fuzzy, low resolution, etc. They were what most people had to deal with, though; most couldn't afford high-end CRTs. I can't find historical pricing data, but I seem to recall my no-name 17" monitor back in 1998 was around $300-500. That would be about $570-950 today (rough math on that below). That is, at a minimum, midrange monitor money, and most people would consider it high end. You can currently spend around $200, which would be like $100 back then, and get a 24" monitor that, while not amazing by today's standards, would completely kick the crap out of that CRT.
I'm happy that more people can get better displays today, and the ultra-high end has survived and there's plenty out there. While, sure, I'd like even better displays than what we have, the high end is really good.
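For a rough sanity check on that inflation math, here's a minimal sketch, assuming a cumulative CPI multiplier of about 1.9x from 1998 to today (a ballpark figure of mine, not an official number):

```python
# Assumed ~1.9x cumulative CPI multiplier, 1998 -> today (ballpark, not official).
CPI_MULTIPLIER = 1.9

# The 1998 no-name 17" monitor prices quoted above:
for price_1998 in (300, 500):
    print(f"${price_1998} in 1998 is roughly ${price_1998 * CPI_MULTIPLIER:.0f} today")

# And the reverse direction for today's ~$200 budget monitor:
print(f"$200 today is roughly ${200 / CPI_MULTIPLIER:.0f} in 1998 dollars")
# $300 in 1998 is roughly $570 today
# $500 in 1998 is roughly $950 today
# $200 today is roughly $105 in 1998 dollars
```

The multiplier is the only moving part; swap in an exact CPI ratio if you want tighter numbers.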
Yeah, when I bought it secondhand in the mid-2000s, my HP P1110 (Sony Trinitron rebrand) 21" CRT was like... $650? And again, that was used, from a neighbor down the street.
Indeed, as you say, CRT was ultimately doomed, because it couldn't scale.

Wasn't the problem rather that it scaled way too much?
FALD never died; it was a tech that just required other tech advancements to get better. Early sets couldn't have a lot of zones, in large part because of the computing power needed to properly control them. I mean, sure, in theory you could develop a massive power-hungry chip to do it, but nobody would pay for that, so what you could get out of the kind of chips we used for control was limited. Smaller sets were simply limited by LED size: prior to MiniLED getting good, you just didn't have small, bright LEDs that you could use on smaller displays to get good results. (A toy sketch of the per-zone work is below.)

I thought FALD was dead for a few years, but maybe I was just paying attention to smaller sets that I could potentially use as a monitor. It all came roaring back, though. I think it was Vizio that announced a new line of TVs with every size including some type of FALD; might have been the end of 2013 or in 2014. And with that the dam broke. Or that's how I perceived it, anyway. (With, I assume, OLED's arrival helping spur FALD advancement.)
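To picture that per-zone work, here's a minimal sketch of the naive core idea, assuming a simple peak-luminance policy per zone; real controllers also do temporal filtering, halo management, and so on:

```python
import numpy as np

def fald_backlight(frame: np.ndarray, zones=(12, 8)) -> np.ndarray:
    """Toy full-array local dimming: drive each backlight zone at the
    peak luminance of the pixels it covers.

    frame: HxW array of pixel luminance in [0, 1].
    zones: (rows, cols) of backlight zones.
    Illustrative only -- real controllers filter over time and trade
    halos against crushed highlights; this is just the naive core.
    """
    h, w = frame.shape
    zr, zc = zones
    levels = np.zeros(zones)
    for r in range(zr):
        for c in range(zc):
            block = frame[r * h // zr:(r + 1) * h // zr,
                          c * w // zc:(c + 1) * w // zc]
            levels[r, c] = block.max()
    return levels

# A mostly black frame with one bright highlight in the corner:
frame = np.zeros((1080, 1920))
frame[:90, :240] = 1.0
print(fald_backlight(frame))  # only zone (0, 0) lights up; the rest stay dark
```

Even this trivial policy touches every pixel every frame; doing it well, at high zone counts and refresh rates, is where the silicon cost came from.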
My CRT was my media monitor on my gaming computer. I never did any work on that machine. I would do the same with OLED.

I like a multi-screen/type approach. It's just I have to have them in different rooms. Such is my psychology...
Baby steps so far: Quest 3 multiscreen shows a rudimentary version of that kind of thing. It's still on a larger headset, but it's probably foreshadowing the way things will go in the long run:
https://www.reddit.com/r/OculusQuest/comments/17cuzz8/spiderman_2_with_the_quest_3/
There are some 1080p micro-OLED sunglass-style devices that can show a screen in front of you too. Apple pushed their lightweight sunglasses form factor back to 2027, so we might see some greater advancements around then.
Meta (still bulky, boxy) VR headset virtual desktop marketing:

Apple Vision Pro (still pretty bulky, ~ski goggles) marketing:
XReal Air / Air 2 XR glasses: micro-OLED, 60 Hz. The device's resolution is only 1080p on current models, so the desktop resolution pictured looks like more of a simulation/marketing shot here. Still, it works and can also play games, e.g. acting as a floating display for handheld gaming devices.
Nreal marketing:
I purchased the Alienware QD-OLED AW3423DW last week from Best Buy while it was on sale. I use it mainly for work and some gaming. So far, with ClearType, the text isn't that bad; it just looks a bit blurry to me. I may try that MacType. After 20 hours, no burn-in.

I hope it lasts. The faster refresh compared to my older 75 Hz IPS, and the reduction of blurring in fast-moving scenes, make it worth it in my opinion. I may change my mind by 2400 hours if/when burn-in appears.
My OLED isn't as fortunate. It lives a life of Excel, etc...
Best damn looking spreadsheets ever though, right?
Nah you're doing it wrong. You don't have better images if you're not staring at 1000+ nit HDR all day long.
Well, they're definitely bigger on this thing.
Sweet! I also keep the brightness low when connected to my work machine. Minimal really. However, with RTX HDR this may no longer be the case when it's connected to my personal machine.

Mine (42" LG C3) lives in Word/Excel/Outlook/Corporate Web-App hell. Anywhere from 8-12 hours a day.
And thus far ~5 months in, it has been just fine. Not even a hint of any burn-in.
That said, I work in a basement office with no windows, and as such I keep the brightness down (because it is not needed, AND because I don't like my eye balls scorched) and I think this helps a lot.
I got the 5-year Best Buy coverage plan on it, so if anything ever does happen, I'll just have them take care of it (provided they don't go out of business or something).
Only issue I've had is one stubborn intermittent "dead" pixel. It comes and it goes. I suspect it might be screen temperature related, visible when cold, and gone when warmed up, but I am not 100% sure. Thus far I am living with it, but if it gets worse or more appear, I may start talking to the Best Buy plan people.
Mine's on zero for my personal machine. Still debating on whether to leave it on all the time. Maybe, though...

I run in HDR mode 100% of the time on my C2, but the SDR brightness slider in Windows is set to 5% (frankly, leaving it at 0 is fine), as anything brighter is just too bright for the dark room I use it in.

So, I have the TV set up for ideal HDR performance, meaning the brightness on the TV itself is high, as it should be. However, since the large majority of the time I'm viewing my SDR desktop, the SDR content brightness slider is very low (as I explained above). This means I don't have to touch anything when moving between SDR and HDR content; it's just handled automatically by Windows. The SDR desktop and UI elements are all low brightness for ease of viewing and burn-in prevention, but when I play an HDR video it's full bright, as expected of HDR content.
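For what that slider actually does: as I understand it, Windows maps the SDR content brightness slider roughly linearly from about 80 nits at 0 to about 480 nits at 100 (an assumption based on commonly cited figures, not something I've measured), so here's a quick sketch of what a 5% setting works out to:

```python
def sdr_white_nits(slider: float) -> float:
    """Approximate SDR 'paper white' for a given Windows SDR content
    brightness slider value (0-100).

    Assumes the commonly cited linear mapping of 80 nits at 0 up to
    480 nits at 100 -- a rule of thumb, not an official spec.
    """
    return 80 + (slider / 100) * 400

for s in (0, 5, 50, 100):
    print(f"slider {s:3d} -> ~{sdr_white_nits(s):.0f} nits")
# slider   0 -> ~80 nits
# slider   5 -> ~100 nits
# slider  50 -> ~280 nits
# slider 100 -> ~480 nits
```

So a 0-5% setting keeps the desktop near classic SDR reference brightness, while HDR content ignores the slider entirely and uses the display's full range.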
Mine arrived with some bad pixels along the edges. No big deal so far. Crossing fingers....