Apple 30" Cinema display on PC :cool:

man those are sweet looking, i saw one at CompUSA the other day, and all i can say is WOW :eek:
 
man o man do i love that 30" display :eek:

makes me sad to own a 20.1" Samsung with 25 ms response :( but it is better than nothing ;)
 
We just had a 30" Apple display set up for demoing at the store front, and it looks sexy... too bad I don't own a Mac. If I knew I could run it on my machine at home easily, I'd seriously be interested in buying one, but I don't have the patience or the money to get it all done at this time.
 
In response to the question about the temperature: The Prescott chips are notorious for running hot; 65-70 C is normal with a stock heatsink. So, today I put in a Koolance Exos Al to cool both the CPU and the GPU. (So yes, you can watercool a Quadro FX 3400 with a Koolance block, as I was glad to find out.)

So far, I'm pleased with the temperature performance. The water is running 34-38 C, and the CPU 40-48 C. Not too bad, considering.

However, the Exos is making a whole lot of noise. That is because it's sucking air into the lines from the reservoir, despite the fact that I have it filled as high as possible. The pump intake is so strong it's creating a little whirlpool and gulping air. Right now I have it turned 30 degrees sideways; that's the only thing that will make it stop doing that and run quietly. Gonna have to call Koolance, I guess.....
 
I'm going to buy one as soon as there's a gaming video card for it... does anyone know when those are coming out?
 
You can play with the Quadro FX 3400. It will perform like a 6800GT, which is its basis. I don't know when nVidia will release a consumer card with dual-link DVI, but the chances are high that this will take a while. Why sell dual-link DVI for free when you can sell your workstation line for that purpose... The first samples of the 6800U had dual-link DVI; then someone at nVidia decided to remove the second TMDS transmitter.

Denis
 
Blethrow said:
In response to the question about the temperature: The Prescott chips are notorious for running hot; 65-70 C is normal with a stock heatsink. So, today I put in a Koolance Exos Al to cool both the CPU and the GPU. (So yes, you can watercool a Quadro FX 3400 with a Koolance block, as I was glad to find out.)

So far, I'm pleased with the temperature performance. The water is running 34-38 C, and the CPU 40-48 C. Not too bad, considering.

However, the Exos is making a whole lot of noise. That is because it's sucking air into the lines from the reservoir, despite the fact that I have it filled as high as possible. The pump intake is so strong it's creating a little whirlpool and gulping air. Right now I have it turned 30 degrees sideways; that's the only thing that will make it stop doing that and run quietly. Gonna have to call Koolance, I guess.....
How long have you had your computer built? If it's recent, the stepping on that Prescott should be D0, and it should not run THAT hot even with the stock cooling. Seriously, the vent holes on your case play a role in letting the noise out... that's why quiet parts are recommended with the Lian Li V1200. ALL water coolers are pretty noisy... that's the trade-off, I guess. Well, good luck!
 
Blethrow said:
In response to the question about the temperature: The Prescott chips are notorious for running hot; 65-70 C is normal with a stock heatsink. So, today I put in a Koolance Exos Al to cool both the CPU and the GPU. (So yes, you can watercool a Quadro FX 3400 with a Koolance block, as I was glad to find out.)

So far, I'm pleased with the temperature performance. The water is running 34-38 C, and the CPU 40-48 C. Not too bad, considering.

I have a few colleagues who use the high-end Quadro video cards with just the cooling fan that came with the unit.

You mentioned on pg 1 of this thread that it makes too much noise. Could it be the machine or something else?

Shouldn't the cooling fan from PNY be enough? After all, they chose both the video card and the cooling fan. I'm not a big fan of having liquid cooling inside my machine; the water seems risky.
 
Yep, it's a brand new Prescott. It does run really hot, and I'm quite sure the thermal interface is in good shape.

Regarding the GPU: I'm not really afraid of leaks, so since I'm already cooling the CPU, why not go for it and get rid of one more fan?

I've solved the air problem now, and the Exos is dead quiet. I'm really quite pleased with it.
For those who care, I had to get essentially all the air out of the reservoir to keep it from being drawn into the pumps. This was challenging, and required a syringe with a curved needle to reach down through the fill hole and then back up to the underside of the reservoir lid, where the air was trapped. Not what I'd call a real selling point of the system. Now that it's fixed, though, it's working like a dream.
 
Blethrow, thanks for showing off your awesome setup! I am completely jealous. I have the same video setup, but for some odd reason I'm not able to get mine to work! I get fuzzy, unreadable characters at 2560x1600 instead.

What BIOS version is your Quadro FX 3000?

Would you mind posting your advanced timings?

Happy Hopping, nope, there isn't a cooling fan on the 30". It stays pretty cool, and is only slightly warm near the controls on the right side panel.

resolution2.jpg
 
Hey Razorfish, very sorry to hear about your difficulties. To be clear though, I'm running a 3400, not a 3000. Nominally, both should work just fine, so that's very weird. I'm way short on time right now, but I'll pull up details on my card a little later. I can tell you I'm on the latest driver, though.

One other very weird thing: The 3400 is supposed to have one Dual-Link DVI out and one Single-Link DVI out. Just for the sake of curiosity, I tried using the Single-Link out, since that is not supposed to be able to drive this resolution. But it did; no difference. So it kind of looks like someone is lying, most likely PNY/nVidia, I guess.

More later.
 
Don't know if this helps, but the BIOS on the 3400 is 5.40.02.17.02, and the drivers are 6.14.10.6182.
 
Blethrow said:
One other very weird thing: The 3400 is supposed to have one Dual-Link DVI out and one Single-Link DVI out. Just for the sake of curiosity, I tried using the Single-Link out, since that is not supposed to be able to drive this resolution. But it did; no difference. So it kind of looks like someone is lying, most likely PNY/nVidia, I guess.

More later.

If a single link can run it, then any 6800 Ultra can run a 30". But that's odd. It's possible that the single link works on a static screen but starts to fail under graphics-intensive loads. You'd have to run it constantly to find out, or swap in some other single-link video card to confirm.
 
Yeah, I've double checked that they both work under all conditions. Simplest explanation is that they're both dual-link. I could have sworn the docs said one dual, one single (can't find the docs right now) and the nVidia site indicates the same thing. WTF?
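For anyone curious, the bandwidth arithmetic says a single TMDS link shouldn't be able to sync this panel at 60 Hz. A rough sketch (the blanking figures below are assumed reduced-blanking values; a real monitor's timings may differ slightly):

```python
# Why 2560x1600@60Hz shouldn't fit on single-link DVI.
# Assumption: reduced-blanking timings (160 px / 46 lines of blanking).
SINGLE_LINK_MAX_MHZ = 165  # max pixel clock of one TMDS link per the DVI spec

def pixel_clock_mhz(h_active, v_active, refresh_hz, h_blank, v_blank):
    """Total pixels sent per second (active + blanking), in MHz."""
    return (h_active + h_blank) * (v_active + v_blank) * refresh_hz / 1e6

clk = pixel_clock_mhz(2560, 1600, 60, h_blank=160, v_blank=46)
print(f"required: {clk:.1f} MHz, single-link limit: {SINGLE_LINK_MAX_MHZ} MHz")
# required: 268.6 MHz -- well over the 165 MHz limit, hence dual link
```

So if that port really is single-link, it shouldn't drive this resolution at 60 Hz at all; either there's a second TMDS transmitter behind it after all, or something non-standard is going on.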
 
Love that computer case. Oh, the monitor is nice also..
 
Blethrow said:
Yeah, I've double checked that they both work under all conditions. Simplest explanation is that they're both dual-link. I could have sworn the docs said one dual, one single (can't find the docs right now) and the nVidia site indicates the same thing. WTF?

You know what that means? They are teasing you to buy a 2nd 30" :D
 
Maybe either output can be the dual link, but there is only enough bandwidth left over for single link resolution on the other. Buy the second 30" display and let us know.
 
Are you sure it doesn't have 2 dual link outputs?

I know some of the newer Quadro cards do. We will be using them at work to drive IBM T221's.
 
Peter: I was starting to think the same thing. Thing is, the docs specify a particular one as dual and the other as single.

Stereodude: The 4000 and 4400 do, to my knowledge. Just curious, what sort of work are the IBMs used for? CAD, I would assume.
 
Stereodude said:
Are you sure it doesn't have 2 dual link outputs?

I know some of the newer Quadro cards do. We will be using them at work to drive IBM T221's.

What's the model of the Quadro at your workplace? Is it the FX 3000, FX 3400, FX 4000, or FX 4400?

Also, there's another thread saying that with 2x dual link, the second link connected to the second monitor is slow, and depends on the load of the first one. In other words, there is no parallel processing. Is that your situation?
 
I still like the IBM T221 better. With 2.25x the pixels, everything looks smoother on the IBM. Note that in the picture below, the two browsers are the same size in terms of pixels.

two.jpg
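The 2.25x figure checks out from the published panel specs; a quick sketch of the math (assuming 22.2" 3840x2400 for the T221 and 30" 2560x1600 for the Cinema Display, per their spec sheets):

```python
import math

def ppi(w_px, h_px, diag_in):
    """Pixels per inch along the panel diagonal."""
    return math.hypot(w_px, h_px) / diag_in

t221 = ppi(3840, 2400, 22.2)   # IBM T221
acd = ppi(2560, 1600, 30.0)    # Apple 30" Cinema Display
ratio = (3840 * 2400) / (2560 * 1600)
print(f"T221 ~{t221:.0f} ppi, ACD ~{acd:.0f} ppi, pixel count ratio {ratio}x")
```

That's why everything looks smoother on the T221: roughly double the linear pixel density of the Apple.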
 
You can't really compare those monitors. The IBM is...what...3x as expensive? It has much higher resolution. Of course you like it better. :) The IBM's design looks questionable next to the Apple though. ;)
 
wjchan said:
I still like the IBM T221 better. With 2.25x the pixels, everything looks smoother on the IBM. Note that in the picture below, the two browsers are the same size in terms of pixels.

nice and clean way to pack 10k US$ on one table :D

...i hate you for that, man! ;)
 
wizzackr said:
nice and clean way to pack 10k US$ on one table :D

You didn't notice part of a 2nd T221 on the right side of the picture?
;)

They are not all mine and I didn't pay MSRP for all of them.
 
When the icon designers do their job right, it's fairly easy to pick out a particular icon.
 
wjchan, two IBM's?? that is just awesome... haha

When did they first come out? It's pretty impressive that they can fit so many pixels in only a 22" screen.

The Apple though has got to be easier to read than the IBM's...

I have good news for those on this thread!!! I finally got my ACD 30" to work with a Quadro FX 3000 card! The problem is with some of the very early released versions of that video card. They may not have implemented the dual-link feature correctly. I got a more recently released PNY Quadro FX 3000, and the screen works beautifully with it!! Thanks to all those who provided suggestions and input!
 
razorfish said:
wjchan, two IBM's?? that is just awesome... haha

When did they first come out? It's pretty impressive that they can fit so many pixels in only a 22" screen.

The T220 was announced on 6/27/2001 with a starting price of $22k. The first customer was the Lawrence Livermore Lab. Full-scale production started 3Q01.

The Apple though has got to be easier to read than the IBM's...

Hmm... Your printed page is 300dpi+ and you wouldn't call it hard to read, would you? Operating systems are adding more resolution-independent features. Longhorn, for example, uses device-independent pixels where each pixel is 1/96". For the current generation of OSes, I do have to set my dpi manually and enlarge some UI elements.
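To illustrate the 1/96" device-independent pixel idea (the function name and dpi values below are just illustrative, not any real OS API):

```python
def dips_to_pixels(dips, screen_dpi):
    """Map device-independent pixels (1/96 inch each) to physical pixels."""
    return round(dips * screen_dpi / 96)

# A one-inch (96-DIP) UI element on three screens of increasing density:
for name, dpi in [("96 dpi baseline", 96), ("30\" ACD (~101 dpi)", 101),
                  ("T221 (~204 dpi)", 204)]:
    print(f"{name}: {dips_to_pixels(96, dpi)} physical px")
```

Under a scheme like that, the same one-inch button stays one inch on any panel; denser screens simply render it with more pixels.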
 
Here's one of my favorite illustrations. It shows how a 5120x2048 desktop would look on various multi-monitor setups. This all started when someone mentioned he had an 8-monitor setup on his Mac. The picture is to scale. Note that there's no video card support for a 4-Apple-30" setup on the Mac yet, and on the PC side you'll need a motherboard with 2 PCI Express x16 slots (an SLI board without using the SLI feature).

all1.jpg
 
No question in my mind that the IBM represents the sort of DPI that will be standard down the road. That's a monitor that's ten years ahead of its time. Unfortunately, there are no video cards that are ten years ahead of their time.

Congrats Razorfish! Gorgeous, isn't it?
 
Is the T221 suitable for web browsing? The text looks kind of small at such a high res on that 22" screen

And WHERE can I get that Apple wallpaper?
 
razorfish said:
I have good news for those on this thread!!! I finally got my ACD 30" to work with a Quadro FX 3000 card! The problem is with some of the very early released versions of that video card. They may not have implemented the dual-link feature correctly. I got a more recently released PNY Quadro FX 3000, and the screen works beautifully with it!! Thanks to all those who provided suggestions and input!

what's ACD?
 
Apple Cinema Display. :)

Hey Happy,

Since you're holding off on this panel until you can get an HP, are you also going to wait for a good SLI capable motherboard? I think I probably would, in your position as I understand it. I'm dying to drop in a second card. ;)
 