Why can't I use a DVI-HDMI cable, and what about HDMI video cards?

InorganicMatter

Right, I've got this TV, and a nice HTPC with a 6800nu video card. The TV has a VGA input on it, the old blue analog kind. I wanted to do what all you guys suggest: run a DVI-HDMI cable from the DVI-out port on my card to one of the HDMI inputs on the TV for the best image quality. The problem? This:

User's Guide said:
NOTE: You cannot connect this TV to a PC via HDMI/DVI.
So why can't I use a DVI-HDMI cable? Will my TV break if I do?

So my solution (when I build my next HTPC): a video card that has an HDMI port on it. Yeah? Good idea, bad idea?

My other question: how hard is it to set up a DVI cable? I tried to connect a PC to a TV via S-Video once, and it was a hassle. I had to fiddle with refresh rates and timings and all this complicated stuff that I couldn't figure out, so I just gave up. Do I have to go through all of that with an HDMI connection?
 
For any PC-to-TV connection you are going to need to go through complicated refresh rate tweaking. The TV is not a monitor: it does not have the same resolution or refresh rates as a monitor, and Windows does not offer a TV-friendly refresh rate like 50Hz by default. So you need to use some software to unlock those settings for you.
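For the curious, the arithmetic behind that kind of timing tweak is simple: refresh rate = pixel clock ÷ (total width × total height), where the totals include blanking. Here's a minimal Python sketch of the check a timing tool effectively does, using approximate CVT timing totals and single-link DVI's 165 MHz pixel-clock ceiling; the numbers are illustrative assumptions, not specs from any card or TV in this thread.

Code:
# Rough check of which PC-to-HDTV modes fit under single-link DVI's
# 165 MHz TMDS pixel-clock ceiling. Totals include blanking: the
# "reduced blanking" rows use approximate CVT-RB values, while the
# "CRT-style" row uses roomy GTF-like blanking for comparison.
SINGLE_LINK_DVI_MAX_MHZ = 165.0

modes = {
    # name: (total_width, total_height, refresh_hz)
    "1280x720 @ 60 (reduced blanking)":    (1440,  741, 60),
    "1920x1080 @ 30 (reduced blanking)":   (2080, 1111, 30),
    "1920x1080 @ 60 (reduced blanking)":   (2080, 1111, 60),
    "1920x1080 @ 60 (CRT-style blanking)": (2576, 1118, 60),
}

for name, (tw, th, hz) in modes.items():
    clock_mhz = tw * th * hz / 1e6  # pixels per frame x frames per second
    verdict = "fits" if clock_mhz <= SINGLE_LINK_DVI_MAX_MHZ else "exceeds the link"
    print(f"{name}: {clock_mhz:6.1f} MHz ({verdict})")

The last row is one reason a default 1080p60 mode can fail to sync while a reduced-blanking version of the exact same resolution works.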
 
I helped set up a compy for one of my dad's friends... he had a very similar TV to that Samsung (it was a Samsung in fact, just a different size/model), and after messing around with PowerStrip and the nVidia drivers for a while (a 7300GS in this case), we looked in the user's manual, and lo and behold: "HDMI/DVI + Computer = BOOM".

We just ended up using a VGA plug, though I suggested he just get a competently designed model like the Westy 37".
 
I tried connecting my computer via DVI/HDMI to my new 32" LCD TV... the results were disappointing. My TV didn't support the resolution over HDMI, so I had to use VGA. The picture quality is excellent IMO, so it didn't matter to me.
 
There is far too much WRONG information in this thread!

To the OP: it says that in the manual because the manufacturer is trying to cover their ass. Connecting your PC to your TV with a HDMI/DVI cable will work fine.

Both my TV and my Dad's TV say the same exact thing. They both work perfectly sending the TV 1920x1080 via DVI. ATI and nVidia have done a really good job of supporting HDTVs with their driver packages. Just make sure you have the latest drivers and you'll be good to go.
 
Good to know. I was wondering about this same thing with my Samsung TV. For the bedroom I will soon be getting a Vizio or Olevia; I haven't decided on which one or what size.
 
valve1138 said:
There is far too much WRONG information in this thread!

To the OP: it says that in the manual because the manufacturer is trying to cover their ass. Connecting your PC to your TV with a HDMI/DVI cable will work fine.

Both my TV and my Dad's TV say the same exact thing. They both work perfectly sending the TV 1920x1080 via DVI. ATI and nVidia have done a really good job of supporting HDTVs with their driver packages. Just make sure you have the latest drivers and you'll be good to go.
QFT
Thank you
 
valve1138 said:
There is far too much WRONG information in this thread!
Exactly. Only in rare cases should PowerStrip be used today; when it is needed, it's because the TV is doing something weird or is non-standard in several ways (rectangular pixels, for one).

Anyway, to add on to what Valve said: the VGA port is often limited to a certain resolution, something like 1366x"whatever," even on a 1080p/i display, because, yes, the manufacturers are covering their asses. If the PC damages the TV in any way (like burn-in), they can point and say "You were using a PC with it!" and refuse to do anything, since you voided the warranty by letting a PC touch the TV over a DVI/HDMI cable.
 
One note to the OP. Looking at the specs for your TV, set your res to 1280x720 and you should be fine.
 
valve1138 said:
One note to the OP. Looking at the specs for your TV, set your res to 1280x720 and you should be fine.
Yeah, that's how I run it on the VGA connection now. It works, but the image is a little fuzzy, and there is an inch-wide black frame around the perimeter of the image. Not cool.
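One possible explanation for that frame, assuming the set has a 1366x768 panel and maps the 1280x720 VGA signal pixel-for-pixel instead of scaling it; the panel resolution and the 37-inch width below are guesses for illustration, not specs from this TV.

Code:
# If a 1366x768 panel shows a 1280x720 signal 1:1 (no scaling),
# the leftover pixels become a black border on every side.
panel_w, panel_h = 1366, 768   # assumed native panel resolution
sig_w, sig_h = 1280, 720       # signal coming from the PC

border_x = (panel_w - sig_w) / 2   # 43 px unused left and right
border_y = (panel_h - sig_h) / 2   # 24 px unused top and bottom

# On a 37" 16:9 screen (about 32.2" wide), 43 px is roughly an inch:
inches_per_px = 32.2 / panel_w
print(border_x * inches_per_px)    # -> ~1.0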
 
InorganicMatter said:
Yeah, that's how I run it on the VGA connection now. It works, but the image is a little fuzzy, and there is an inch-wide black frame around the perimeter of the image. Not cool.


I get the same thing running VGA. Hoping DVI/HDMI will help me get rid of that. I won't be able to find out until I get my new case in, because I am in the middle of transitioning setups.
 
quique55 said:
I get the same thing running VGA. Hoping DVI/HDMI will help me get rid of that. I won't be able to find out until I get my new case in, because I am in the middle of transitioning setups.
You know, I think I'll let you be the guinea pig. :p :D Post here when you get it finished, and get some up-close pics if you can.
 
big daddy fatsacks said:
For any PC-to-TV connection you are going to need to go through complicated refresh rate tweaking. The TV is not a monitor: it does not have the same resolution or refresh rates as a monitor, and Windows does not offer a TV-friendly refresh rate like 50Hz by default. So you need to use some software to unlock those settings for you.

Um, that's only partially correct!

7800GS SOC: set the resolution to 1920x1080 @ 60Hz, plug in the cable, connect to the 52-inch Aquos, set the Aquos to map dot-for-dot, and boom, 1080p pops up in the top corner of the TV. No tweaking or anything.

Now my new card is having more of an issue. I just got the X1950 AGP, and it will do 1920x1080 at 30Hz (1080i), but put it on 60Hz and the video card limits the output to something that appears to be 1600x1200... Not exactly sure. I'm going to try the tweak of using something like 55Hz, or maybe even try upping the refresh rate; the Sharp Aquos shows up with a 75Hz max refresh rate, so I might try that to see if it avoids the HDCP that I think is limiting this card. The package does say HDCP at 1080i only... so there has to be a workaround to pop it out of HDCP...
 
InorganicMatter said:
Yeah, that's how I run it on the VGA connection now. It works, but the image is a little fuzzy, and there is an inch-wide black frame around the perimeter of the image. Not cool.

Your vid card does support HDCP, but at what resolution? I'm guessing 720i then... I'm getting the same thing, but my vid card, the X1950 AGP, states HDCP up to 1080i, and so if I use a 30Hz refresh it uses the full panel; move to 60Hz and bam, back to the 2-inch border... ick!

Try this: 1280x720 at a 30Hz refresh.
 
Sunin said:
I just got the X1950 AGP, and it will do 1920x1080 at 30Hz (1080i), but put it on 60Hz and the video card limits the output to something that appears to be 1600x1200... Not exactly sure. I'm going to try the tweak of using something like 55Hz, or maybe even try upping the refresh rate; the Sharp Aquos shows up with a 75Hz max refresh rate, so I might try that to see if it avoids the HDCP that I think is limiting this card. The package does say HDCP at 1080i only... so there has to be a workaround to pop it out of HDCP...
That's because ATI's drivers suck when it comes to HDTV out over DVI; this is why NV cards are recommended, since ATI hasn't been able to get its act together for DVI out.

It has nothing to do with HDCP, nor is there such a thing as "720i." And no, his 6800NU does not support HDCP; I don't even know why you're assuming that.
 
CrimandEvil said:
That's because ATI's drivers suck when it comes to HDTV out over DVI; this is why NV cards are recommended, since ATI hasn't been able to get its act together for DVI out.

It has nothing to do with HDCP, nor is there such a thing as "720i." And no, his 6800NU does not support HDCP; I don't even know why you're assuming that.

Simmer down now... LOL...

Well then why, with the X1950, does it say HDCP 1080i? If I select that, I get the full panel; if I select 1080p, I get a 2-inch band around the whole image. I agree the drivers are screwy, but I'm sure there has to be a workaround.

Second, excuse me; I misread him saying "HTPC" as "HDCP"...
 
I have this Philips, and it doesn't work very well as a monitor.

My Media Center PC is powered by a GeForce 6200 (AGP, so no TurboCache); VGA auto-detects at 848x480 and won't allow anything higher OR lower. DVI turns up weird resolutions, nothing with a vertical over 768 or so, IIRC. Hook up my YPbPr cables, and BAM! Just like that, a perfect 1080 picture. The weird thing is, the Media Center app shows up perfectly; everything else, starting at the Windows desktop, is skewed off-center. End result: the computer is hooked up via VGA on the primary display with the desktop and Start menu, and the Media Center app runs on the "secondary" display via component. No one ever exits the MCE interface except for Windows Updates, DivX conversions, and the like, so it works out perfectly.
 
OK. Since I couldn't wait until my new rig was up, I plugged a DVI/HDMI cable into the back of my XPS Gen 2 lappy, which has a 6800 Ultra card. The border is gone and the card auto-configured to 720p. I will be watching movies tonight, so I will report back on quality.
 
Everything looked good. Might be time to get an Xbox 360 HD DVD drive to watch some high-quality stuff. Hope Blockbuster starts carrying HD-DVD titles.

DVI-HDMI works no problem on the Samsung sets.
 
Valve... not every TV works with this config, as some models, for some inane reason, have the HDCP flag set to "no" for DVI over HDMI. It's not gonna blow either one up or anything; it just won't work, OR won't work fully without the digital sound stream coming over HDMI along with the video, the way the HDMI spec with HDCP is designed.

Some manufacturers are very loose about factory settings; the ones that tend to kiss Hollywood's ass a lot are the ones you will have more problems with. Frankly, unless you are pumping 1080p through it, VGA works dandy anyway.
 
Resurrecting this from the grave because I just brought home a Sharp Aquos 42" 1080p set, and the manual tells me the max resolution from a PC over HDMI is 1280x1024.

Anyone have any experience with this? Of my current PCs, I have an X850 Pro and an X1600 Pro (a PC upgrade will happen in about 6 months or so).

I'm hearing that 1080p is 1920x1080 at 60Hz - and if the vid card and driver support it, the TV should have no issues?

I haven't taken it out of the box yet - this piece of info in the manual stopped me cold.

Thanks,

BB
 
My Samsung says to run the computer at 1024x768. I run it at 1280x720 (720p) with no problems at all.
 
Resurrecting this from the grave because I just brought home a Sharp Aquos 42" 1080p set, and the manual tells me the max resolution from a PC over HDMI is 1280x1024.
They lie. HDMI is spec'd up to 1920x1080; they say 1024x768 for a PC because it used to be possible to screw up a TV when connecting a PC to it, due to having to create custom resolutions for it.

You don't really need to do that now, and their manuals should be updated to reflect that.
 
They lie. HDMI is spec'd up to 1920x1080; they say 1024x768 for a PC because it used to be possible to screw up a TV when connecting a PC to it, due to having to create custom resolutions for it.

You don't really need to do that now, and their manuals should be updated to reflect that.

True on the connection, but from memory, certain model lines of the Sharps were hamstrung by that very problem over HDMI.

A lot of people couldn't get 1080p easily, or only via a complicated workaround. I don't remember the details, but I DO remember not getting a Sharp for that reason. Now, that was several months ago; a new line may have rectified that issue.
 
AVS Forum has several people with the 46" version of this set being able to just plug and play at 1920x1080. I'll find out if the 42" works OK too, once I get my DVI-HDMI cable.

BB
 
AVS Forum has several people with the 46" version of this set being able to just plug and play at 1920x1080. I'll find out if the 42" works OK too, once I get my DVI-HDMI cable.

BB


Might have been the first-gen models.
 
big daddy fatsacks said:
For any PC-to-TV connection you are going to need to go through complicated refresh rate tweaking. The TV is not a monitor: it does not have the same resolution or refresh rates as a monitor, and Windows does not offer a TV-friendly refresh rate like 50Hz by default. So you need to use some software to unlock those settings for you.

Sorry, no.

The problem with that is the video card and drivers. More modern cards (nVidia more so than ATI) pick right up on it with only minor overscan tweaking; I'd call it easy-to-moderate on the difficulty scale (see the sketch below).

As for the problem with DVI to HDMI, it's not uncommon. Samsungs in particular don't take PC input via HDMI. You'll need an HDMI-to-VGA converter, as many Samsungs will only take A/V sources via HDMI, not a PC.

Same with the LN-S##9#D (where the first two # are the size in inches, and the second # is the trim level for that glass).

zv
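A quick sketch of what that overscan tweaking compensates for, assuming the set crops roughly 5% of each axis; that figure is a common ballpark only, and the actual amount varies by set (it's what the driver's overscan/underscan slider is adjusting).

Code:
# How much of a 1080p desktop survives ~5% overscan. The driver's
# overscan slider effectively shrinks the output so these edges
# come back into view; 0.05 is an assumed ballpark, not a spec.
native_w, native_h = 1920, 1080
overscan = 0.05  # fraction of each axis hidden past the screen edge

visible_w = round(native_w * (1 - overscan))
visible_h = round(native_h * (1 - overscan))
print(visible_w, visible_h)  # -> 1824 1026: taskbar and window edges get eaten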
 
The cable should get here by Thursday. If it doesn't work with either flavor of video card (I've got a 7600GS to try on it as well), it goes back this weekend, I get a smaller, cheaper 1366x768 37" set (Westy? Philips? Samsung?), and I wait 5 years for something better.

I'll let y'all know how it turns out.

BB
 
My Samsung says to run the computer at 1024x768. I run it at 1280x720 (720p) with no problems at all.

Man, I am so glad to hear that. I just recently purchased that same TV off of Amazon.com ($999, free shipping, and no sales tax :)). I have also pieced together an HTPC from a spare computer lying around. I was worried that it might not support 720p, but after reading this thread I feel confident it will. Now I'm only worried that my Radeon 9500 won't run the TV at 720p...

EDIT: BTW, are you running your computer over VGA or DVI/HDMI?
 
Got my DVI-HDMI cable today. Plugged it into input 4, which has the auxiliary L/R audio inputs to accept sound from the sound card.

1920x1080 at 60Hz caused the thing not to sync. On a whim, I changed the monitor refresh to 30Hz, and voila! I got what you see (albeit through CRAPPY pics) below.

So - is this the way it's supposed to work? It LOOKS like it's 1920x1080, but the TV is showing 1080i, not 1080p. So what's happening? I don't see any flicker, so I don't THINK it's interlaced.

[attached: settings_small.jpg, TV_small.jpg]


BB
 
Outputting at 30 means it's interlaced; outputting at 60 would mean it's progressive. Not sure what's going on with your display; it could be that it's weird and has issues with PC connections (some sets are), or it's one of ATI's crappy HDTV-over-DVI driver issues that they never seem to bother fixing.
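A back-of-the-envelope way to see the 30/60 relationship, assuming the driver's refresh figure counts full frames while 1080i actually sends 60 half-height fields per second; that labeling is an assumption about these drivers, not something from ATI's documentation.

Code:
# 1080i vs 1080p line rates: interlacing sends every other line per
# pass, so 60 fields/s of 1080i moves the same number of lines as
# 30 full progressive frames/s -- hence the "30 Hz" label.
h = 1080

lines_1080i_60fields = (h // 2) * 60   # 540 lines per field, 60 fields/s
lines_1080p_30frames = h * 30          # 30 full frames per second
lines_1080p_60frames = h * 60          # what true 1080p60 has to move

print(lines_1080i_60fields == lines_1080p_30frames)  # True
print(lines_1080p_60frames / lines_1080i_60fields)   # 2.0, double the data

And since an LCD panel is inherently progressive and has to deinterlace whatever interlaced signal it's fed, you wouldn't expect visible flicker either way.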
 
It only says that because ATI only outputs it at 30Hz - but there's no flicker, and the one game I play regularly had no issues with it at all. *shrug*

Took it back to BB today anyway, and picked up a 46" Aquos instead. Took the opportunity of a price-matched CompUSA $400.00 instant rebate on top of BB's 36-month interest-free financing to do the upgrade.

In all my research, I had somehow missed the fact that the 42" has a 1200:1 native contrast while the 46" and 52" have 2000:1 native contrast.

The 46 is much, much better at blacks. And the PC input still shows as 1080i. Guess I'll have to grab an nVidia card - though I hear PowerStrip can enable the proper 1080p input to the TV.

BB
 
Am I the only one having any luck with ATI drivers and HDTV settings?
I just recently switched from an NV6600 to an onboard ATI DVI-HDMI hookup with my new NSK2400, and my picture is excellent in MCE. You can customize your own resolutions without the need for PowerStrip with the new ATI drivers. Here are 2 screens I took of my setup.

[attached: 1.JPG, 2.JPG]

Not sure what's so hard about dialing in an ATI card with HDTV settings.
 
Am I the only one having any luck with ATI drivers and HDTV settings?

Not sure what's so hard about dialing in an ATI card with HDTV settings.
It all depends on what TV you have; some work perfectly fine with ATI cards, while others are impossible to get working.
 
It works fine and shows up for me at 1920x1080 @ 30. But the TV shows 1920x1080 interlaced. I was wondering why ATI limits it to 1920x1080 @ 30 and not 1920x1080 @ 60. The 60 is there, but it doesn't sync right and effs up the display. At least, for my X850 Pro. Do you have 1080p in that list? If so, maybe it's time to upgrade this aging rig one last time to an X1950 Pro.

BB
 
I didn't know that X850 Pros could do 1080i... I guess the X16xx series should be able to do 1080p. Personally, I believe that 1080p is a marketing ploy if anything.

I have seen 2 TVs side by side (same make and model), one set at 1080i and one set at 1080p. And while there was a minor difference in image quality to my eyes, it simply wasn't worth the 3000+ dollars for it (I live in Australia). So I got a 32-inch Philips (with Pixel Plus, and it does 1080i) instead at only 1499.

I have every intention of waiting 5 years or so before getting a new TV for myself... but what gives with people going ooohhh, 1080p? Yes, I understand how the whole thing works and the technology behind it, but is it worth the hassle? I believe it's worth waiting for 1080p TVs to become more mainstream and for resolutions to get higher on the 32-inch TVs before going down that road.
 
I don't know if it makes a difference for PC display. I'm not looking to HTPC it with this thing, just game on it. I have a few games that will work at that res - or work windowed with a TeamSpeak server or web browser up next to it.

BB
 
I didn't know that X850 Pros could do 1080i... I guess the X16xx series should be able to do 1080p. Personally, I believe that 1080p is a marketing ploy if anything.

I have seen 2 TVs side by side (same make and model), one set at 1080i and one set at 1080p. And while there was a minor difference in image quality to my eyes, it simply wasn't worth the 3000+ dollars for it (I live in Australia). So I got a 32-inch Philips (with Pixel Plus, and it does 1080i) instead at only 1499.

I have every intention of waiting 5 years or so before getting a new TV for myself... but what gives with people going ooohhh, 1080p? Yes, I understand how the whole thing works and the technology behind it, but is it worth the hassle? I believe it's worth waiting for 1080p TVs to become more mainstream and for resolutions to get higher on the 32-inch TVs before going down that road.

Trust me, if you are using your HDTV as a monitor, you NEED to run in progressive mode. My 8800 GTX's drivers always default to 1080i when connecting my PC to my Westy 37". I can instantly tell when it's not in progressive mode: the image is slightly fuzzy, and I will get a headache from eyestrain within 10 minutes.

That's a lot to pay for a TV, but then again, you live in Australia. I bought my Westy for $1000 and I've never had a problem running at 1080p. I love my TV... I bow to it daily.

OP, why do you need to use a DVI-HDMI cable? Couldn't you use DVI-DVI (with HDCP)?

EDIT: Never mind, that HDTV doesn't have DVI inputs.
 