HD video

Whitebread

I need a little help choosing a vid card here. I want a card that can display DVDs and HD video on a Gateway FPD2185W 21-inch widescreen without a problem (display running at native res). It has to be PCI-E with DVI; I'm indifferent as to GPU manufacturer. I don't want to spend too much money, the cheaper the better. Remember, I don't play games, and I don't plan on doing so in the future.
 
The 6600 GT is a good choice. Pretty much the cheapest solution if you want full HD hardware decoding up to 1080i.
 
6150 boards do HW decoding up to 1080p, and the cost would be half as much as a 6600gt.
 
curtisfong said:
6150 boards do HW decoding up to 1080p, and the cost would be half as much as a 6600gt.

No... the integrated 6150 boards DO NOT have HD hardware support past 720p, so if you watch anything with a higher res (like 1080i), it would be jerky as hell. ALL 6000-series cards have some H.264 acceleration, but only 6600GT (not vanilla) and above support it completely up to 1080i.

Also, there is no such thing as 1080p. ;) Well, there might be in the future, but there isn't any progressive HD video at that resolution yet...

However, for any of this hardware acceleration to be in effect, the video would have to be encoded in H.264 format... not too hard if you have FFDShow though.
 
PWMK2 said:
No... the integrated 6150 boards DO NOT have HD hardware support past 720p, so if you watch anything with a higher res (like 1080i), it would be jerky as hell. ALL 6000-series cards have some H.264 acceleration, but only 6600GT (not vanilla) and above support it completely up to 1080i.

Also, there is no such thing as 1080p. ;) Well, there might be in the future, but there isn't any progressive HD video at that resolution yet...

However, for any of this hardware acceleration to be in effect, the video would have to be encoded in H.264 format... not too hard if you have FFDShow though.

What do you mean, no such thing as 1080p? All of QuickTime's videos go that high, and as far as I know WMV-HD also goes that high. I'd definitely want a card that can do 1080p or you're just wasting your money.
 
bealzz said:
What do you mean, no such thing as 1080p? All of QuickTime's videos go that high, and as far as I know WMV-HD also goes that high. I'd definitely want a card that can do 1080p or you're just wasting your money.

Well, considering there is NO content in 1080p, I don't know why you'd be wasting your money. Not being able to go up to 1080i, on the other hand, IS wasting your money.

I'll say it again. There is no progressive 1920x1080 format (1080p). There is only interlaced 1920x1080 (1080i). Theoretically such a standard could exist, but it would require some pretty advanced hardware...
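To put numbers on why progressive 1080 was considered demanding at the time, here's a quick back-of-the-envelope of raw pixel throughput per format. The 60 fields/frames per second rate is an assumption for illustration, and this is a sketch of decode workload, not a benchmark:

```python
# Rough pixel-throughput comparison of common video formats.
# Assumes 60 fields/frames per second, purely illustrative.

def pixel_rate(width, height, fps, interlaced=False):
    """Pixels delivered per second; an interlaced format carries half the lines per field."""
    lines = height // 2 if interlaced else height
    return width * lines * fps

for name, rate in {
    "480p60":  pixel_rate(720, 480, 60),
    "720p60":  pixel_rate(1280, 720, 60),
    "1080i60": pixel_rate(1920, 1080, 60, interlaced=True),
    "1080p60": pixel_rate(1920, 1080, 60),
}.items():
    print(f"{name}: {rate / 1e6:.1f} Mpixels/s")
```

1080p at 60 frames carries exactly twice the raw pixels per second of 1080i at 60 fields, which is the gap the "pretty advanced hardware" comment is pointing at.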
 
ALL 6000-series cards have some H.264 acceleration

As of right now, NO NVIDIA cards have any kind of H.264 acceleration enabled. Nvidia is promising it in the ForceWare 85 series of drivers... if you want HD H.264 right now, you will have to go the ATI route.

There is no progressive 1920x1080 format (1080p)

I've seen Lord of the Rings: Return of the King in 1080p, although in retrospect it could have just been a 1080i that somebody had upconverted... however it didn't seem that way, as the quality was excellent with absolutely no signs of interlacing... although it was about 30GB in size :(
 
H.264? HD H.264? What is all this? I'm going to assume it's hardware support for HD video types? Support for future 1080i or 1080p (if we get any content) would be nice to have in a video card; I am trying to assemble a system that will last for a few years. Would I need to spend large amounts of money for this?
 
I've built a box around the MSI 6150 board. Aspire case, 3200+ cpu, < 50% utilization, xp90 (no fan), good temps.

HD content (incl. 1080p) can be found here and here.

6150 specs here.

No stuttering on the box I built. Looks great on a Sceptre 1080p TV!

No H.264 hw decode yet... Nvidia has promised this in future drivers, as previous posters have stated.
 
As an Amazon Associate, HardForum may earn from qualifying purchases.
Currently, movies on DVD are encoded using just one codec, MPEG-2, and any half-decent PC sold within the last 1-2 years can play MPEG-2 files in HiDef (1080p). Unfortunately MPEG-2 comes at a cost: file size. MPEG-2 needs a high bitrate to maintain quality at those resolutions, so HiDef movies will take anywhere from 10-20GB+ of disk space.

H.264 is one of the new codecs which can be used on HD-DVD/Blu-ray discs. The codec solves the disk-space issue above by being able to maintain the same quality level as MPEG-2 at a lower bitrate, maybe even 1/3 of the bitrate... which means the file size will be 1/3 of the MPEG-2 movie. Unfortunately, it takes an obscene amount of general CPU power to decode at the moment, and right now the only GPUs that offer hardware acceleration are ATI's new range.
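The file-size arithmetic behind that 1/3 claim is easy to sanity-check. The 20 Mbit/s MPEG-2 bitrate and the 2-hour runtime below are illustrative assumptions, not figures from any spec:

```python
def filesize_gb(bitrate_mbps, runtime_min):
    """Approximate file size in GB for a constant average video bitrate."""
    bits = bitrate_mbps * 1e6 * runtime_min * 60
    return bits / 8 / 1e9  # bits -> bytes -> GB

mpeg2 = filesize_gb(20.0, 120)       # hypothetical hi-def MPEG-2 movie
h264 = filesize_gb(20.0 / 3, 120)    # same movie at roughly 1/3 the bitrate
print(f"MPEG-2: {mpeg2:.1f} GB, H.264: {h264:.1f} GB")
```

Which lands the MPEG-2 version right in that 10-20GB+ range, with the H.264 version at a third of it.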

To give you an indication of how much CPU is required... my 3200+ A64 with a 7800GT was still dropping frames in a 720p QuickTime movie (maybe 3%), and the CPU was hitting 70-80% just to decode it. Now, Quick/crap Time isn't the fastest decoder available, and things like CoreAVC will let you use less CPU, but at the moment the requirements for H.264 HiDef are still a lot if you don't have an ATI card.

If you must have your system right now, then I would purchase an ATI card, as you know it will be able to hardware-accelerate H.264 immediately (I think you need the X1600 or above for 1080i, but I could be mistaken); I have been waiting over 6 months for NVIDIA to get off their ass and give me H.264 acceleration in their drivers. Whatever your decision, definitely make sure that the card/system you purchase can accelerate at least up to 1080i.
 
curtisfong said:
I've built a box around the MSI 6150 board. Aspire case, 3200+ cpu, < 50% utilization, xp90 (no fan), good temps.

HD content (incl. 1080p) can be found here and here.

6150 specs here.

No stuttering on the box I built. Looks great on a Sceptre 1080p TV!

No H.264 hw decode yet... Nvidia has promised this in future drivers, as previous posters have stated.
Great links to that HD content! I'll have to get a few of those discs when I get my computer. Thanks for the links to that motherboard. I will look into it further.
 
It isn't the best overclocker. The MSI board that I got does NOT have voltage adjustments, but it has everything else. It's running at 10x220, RAM at CPU/10. I would research the board a bit before you spend money if overclocking is important to you. Some boards have DVI, some have component out, some have HD sound, a few have all three. When Nvidia gets H.264 into the drivers one of these months, CPU speed will not be that important anyway.
 
curtisfong said:
No stuttering on the box I built. Looks great on a Sceptre 1080p TV!

Hey curtisfong,

Is your Sceptre the X37SV? When did you buy your Sceptre? Are you familiar with the AVSForum discussion of the X37SV - do you have similar issues with your Sceptre? Any pros/cons after using your Sceptre? Was the Westy 37 a contender for you?

The Dell 2405 (1200p) and the Sceptre 37 (1080p) have ALMOST the same resolution. I've calculated that the Sceptre's DPI is about 60% of the Dell's, so images/text are MAGNIFIED on the Sceptre (2/3 bigger). Do you think viewing things on the Sceptre is grainy? (Not a problem, OR you can make out the individual pixels but you get used to it and just ignore it?)
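That ~60% figure checks out if you assume a 24-inch 1920x1200 panel for the Dell 2405 and a 37-inch 1920x1080 panel for the Sceptre (panel sizes are assumptions here; this is just the pixel-density arithmetic):

```python
import math

def ppi(width_px, height_px, diag_in):
    """Pixels per inch: diagonal pixel count divided by diagonal size in inches."""
    return math.hypot(width_px, height_px) / diag_in

dell = ppi(1920, 1200, 24.0)     # Dell 2405, assumed 24" 1920x1200
sceptre = ppi(1920, 1080, 37.0)  # Sceptre, assumed 37" 1080p
print(f"Dell: {dell:.1f} PPI, Sceptre: {sceptre:.1f} PPI, ratio: {sceptre / dell:.0%}")
```

The ratio comes out around 63%, so each pixel on the Sceptre is roughly 1.6x the linear size of the Dell's, matching the "2/3 bigger" estimate.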

I'm still straddling the fence on the Sceptre. Tech Support has a firmware fix for some of the 37" issues. The 42" is coming out in March and should not have some of the 37's issues and may have a very tantalizing price. I realize that the 1080p displays are quite recent developments and the "bugs" haven't all been discovered, much less solved. But the ability to use the 1080p displays as a PC monitor and also as a HiDef TV along with the attractive pricing, makes these 1080p displays very tempting.

TIA for any answers...
 
Toytown said:
As of right now, NO NVIDIA cards have any kind of H.264 acceleration enabled. Nvidia is promising it in the ForceWare 85 series of drivers... if you want HD H.264 right now, you will have to go the ATI route.



I've seen Lord of the Rings: Return of the King in 1080p, although in retrospect it could have just been a 1080i that somebody had upconverted... however it didn't seem that way, as the quality was excellent with absolutely no signs of interlacing... although it was about 30GB in size :(

Well, considering the official DVD release is in 480p, then yes... it would have to have been upconverted quite a bit by someone else. It wouldn't have had to have looked interlaced because the official release is already progressive...

Anyway, while H.264 acceleration is currently unsupported by the drivers, once it is supported it will greatly help out people with slower CPUs. The guy above got flawless 1080i H.264 on a 6150 board because he has a really nice processor. I would be wary, however, of getting an ATI card for HD acceleration unless you want to crank out the money: it's the X1K series and above only, with the X1300 supporting only 480p, the X1600 supporting 720p and the X1800/X1900 supporting 1080i. That's getting pretty expensive just for that. Normally I'm an ATI fan, but for hardware-accelerated HD video, right now it looks like nVidia if you don't want to spend too much money. Once the drivers come out that unlock the acceleration, it should look pretty nice...
 
Well, considering the official DVD release is in 480p, then yes... it would have to have been upconverted quite a bit by someone else

Lol, I can assure you it definitely was not an upconvert of a 480p DVD. It could have been an upconvert of a 1080i source for sure, but it didn't show any of the normal signs of an interlaced movie; as I dabble quite a bit in HDTV and above formats, I kinda know what to look for to tell if a file's original source was interlaced or not. The official DVD release is not the only way of getting your movies now.

normally I'm an ATI fan but for hardware accelerated HD video, right now it looks like nVidia if you don't want to spend too much money. Once the drivers come out that unlock the acceleration, it should look pretty nice..

I agree that NVidia's PureVideo offers better quality than ATI's at the moment, but until NVidia gives the specs on how they handle their acceleration of H.264, I would stick with ATI; at least you know what card you need for what kind of content you're going to be watching. If Nvidia released some information on their H.264 acceleration, then you could make the decision on what brand and what card based on your price range.

The only thing I'm sick of right now is how long it's taking Nvidia to get their driver done for H.264. I got my 7800GT almost 6 months ago; one of the reasons I got it was that I read somewhere a comment by Nvidia saying it would handle H.264 fine, with a driver release very soon... 6 months on... still no realistic timeframe.
 
Will the NVidia's drivers provide support up to 1080p?

Unfortunately, nobody knows the answer yet. If you look at how ATI have implemented it, you will see that only their high-range cards (I think) can guarantee 1080p decoding of H.264 without any dropped frames... of course, Nvidia might be able to achieve 1080p on something as low as a 6200.

Both parties have only just recently announced hardware support for the likes of InterVideo (Nvidia) and CyberLink (ATI); if you can possibly hold on, I would wait to see which one gives the best performance. If not, and you're desperate to have H.264 decoding right now, I would first get the CoreAVC decoder and see how it runs; if it's not good enough for you, then get an ATI card.

If you can wait it out a while (as not much official H.264 content has been released yet), then I would purchase the lowest-end GeForce 6 series card which offers PureVideo, so that you can enjoy quality MPEG-2 and WMV-HD accelerated video, and then upgrade when there's enough content to warrant the move.
 
nVIDIA handles the H.264 decoding the same way as ATI; they showed it at the CES show. http://www.bit-tech.net/news/2006/01/07/nvidia_decode_h264/

Unlike ATI's restriction of H.264 decoding to the X1xxx series of cards, the H.264 decoding from nVIDIA works on all the PureVideo products, so that includes existing GeForce 6 boards. Yep, I'm supposed to be able to do 1080p decoding of H.264 content with my 6600GT. Doesn't surprise me too much, since I can decode a 1080p video at only 35% CPU utilization anyhow.

result.jpg


This is a screen capture of me running the 1080p sample of The Discoverers from the Microsoft HD WMV website. Since it's a proprietary version of MPEG-4, the calculations are not too much different from standard H.264, so performance should be comparable.

Also, to use H.264 on the ATI series of cards, you have to pay for it separately. Here's the link if you need proof: http://www.atitech.com/technology/H264.html (click on "download decoder"). It takes you here: http://www.cyberlink.com/cinema/ati/h264_decoder/enu/index.jsp

Cyberlink also has shown the nVIDIA version of their H.264 codec at CES, see the press release here: http://www.cyberlink.com/eng/press_room/view_960.html

So ATI's current ability relies on a third-party codec, one that is also being provided to nVIDIA. nVIDIA is taking it a step further with their own PureVideo development, so there will be more choices beyond the CyberLink method. ATI will probably follow suit, but it's important to point out that the technology is really not groundbreaking with respect to the nVIDIA line.
 
Since it's a proprietary version of MPEG-4, the calculations are not too much different from standard H.264

Actually, there can be a significant difference between the calculations required. Once you start enabling CABAC and other options under H.264 AVC encoding, you see the CPU usage shoot up quite a bit: from what I've seen, easily more than double WMV-HD, especially at higher resolutions and higher bitrates.

Also to use H.264 on the ATI series of cards, you have to pay for it separately.

I didn't know that; I thought it was included with the AVIVO suite of products and that they were going to offer a DirectShow decoder. I thought you only had to pay if you wanted to encode/transcode... oh well :(
 
I'll have to go back and check, but NV has said that H.264 decoding is being filtered down all the way to the 6150 chipset (it does have the 6600 PureVideo part on it), but at that low level it will be somewhat limited in what it can do, probably similar to where it's at now.

Personally, I would go with an NV card. I don't like the way ATI has scaled their h.264 features per video card compared to NV.
Toytown said:
Unfortunately, nobody knows the answer yet. If you look at how ATI have implemented it, you will see that only their high-range cards (I think) can guarantee 1080p decoding of H.264 without any dropped frames... of course, Nvidia might be able to achieve 1080p on something as low as a 6200.

Both parties have only just recently announced hardware support for the likes of InterVideo (Nvidia) and CyberLink (ATI); if you can possibly hold on, I would wait to see which one gives the best performance. If not, and you're desperate to have H.264 decoding right now, I would first get the CoreAVC decoder and see how it runs; if it's not good enough for you, then get an ATI card.

If you can wait it out a while (as not much official H.264 content has been released yet), then I would purchase the lowest-end GeForce 6 series card which offers PureVideo, so that you can enjoy quality MPEG-2 and WMV-HD accelerated video, and then upgrade when there's enough content to warrant the move.
 
Personally, I would go with an NV card. I don't like the way ATI has scaled their h.264 features per video card compared to NV

I don't think ATI did this deliberately. I'm pretty sure AVIVO/PureVideo use the pixel shaders to achieve the acceleration, so it's no wonder the higher-end cards are able to accelerate more pixels than a lower-end card (a combination of more, and more optimized, pixel shaders).

Although Nvidia have said theirs will be compatible with the 6 series of cards, even the low end, I see them in the same situation, having to use the pixel shaders to achieve the acceleration, more than likely putting them in the same boat as ATI... whereby, for example, a 6200 may only be able to offload 480p H.264 onto the card and leave the rest to the CPU, or something like that.

It's interesting to speculate :), and I'm sure the 85 ForceWare drivers can't be more than another quarter away, so in time we'll see :). For now the CoreAVC decoder seems to be the most optimized way of playing back H.264 without any hardware acceleration.
 
Toytown said:
Lol, I can assure you it definitely was not an upconvert of a 480p DVD. It could have been an upconvert of a 1080i source for sure, but it didn't show any of the normal signs of an interlaced movie; as I dabble quite a bit in HDTV and above formats, I kinda know what to look for to tell if a file's original source was interlaced or not. The official DVD release is not the only way of getting your movies now.

What other way? Unless you swiped the digital reel from a movie theater, the only way to get it is in 480p. Even then, I believe the format used with digital projectors is 720p. Now, the P means PROGRESSIVE, NOT INTERLACED. So if it was upconverted from 480p to 1080p, of course it wouldn't look interlaced, because it never was interlaced; 480p is a progressive format. An upconvert from 1080i to 1080p, on the other hand, WOULD look interlaced (and isn't even really an upconvert anyway).

i = interlaced
p = progressive

Toytown said:
Unfortunately, nobody knows the answer yet. If you look at how ATI have implemented it, you will see that only their high-range cards (I think) can guarantee 1080p decoding of H.264 without any dropped frames... of course, Nvidia might be able to achieve 1080p on something as low as a 6200.

nVidia has said that the lowest card to have full acceleration up to 1080i/p will be the 6600GT.

HighTest said:
Also, to use H.264 on the ATI series of cards, you have to pay for it separately. Here's the link if you need proof: http://www.atitech.com/technology/H264.html (click on "download decoder"). It takes you here: http://www.cyberlink.com/cinema/ati/h264_decoder/enu/index.jsp

I believe that software comes with the x1000 series cards. (Can someone who has an X1300+ confirm this?)
 
What other way? Unless you swiped the digital reel from a movie theater, the only way to get it is in 480p

There are a few other ways I could have got it.

1. I could have grabbed the HDTV version from one of the US movie channels with an HDTV capture card, then encoded the transport stream to H.264.

2. I could have downloaded it from many different sources: non-legit, filesharing, etc.


Either would have provided me with better than 480p. If you really think that the only way to get such footage is off a DVD, then you must have been living under a rock for some time. Example:

http://privat.bluezone.no/wiak/lotr-rotk-hl/vlcsnap-189545.png (not my URL though, so it could go at any time)
 
Toytown said:
There are a few other ways I could have got it.

1. I could have grabbed the HDTV version from one of the US movie channels with an HDTV capture card, then encoded the transport stream to H.264.

2. I could have downloaded it from many different sources: non-legit, filesharing, etc.

3. I could work at a large corporate imaging company with a turnover of billions, and although it's not exactly my job there, I could have access to a lot of different high-def sources being trialled for different digital cinema technology.

Either would have provided me with better than 480p. If you really think that the only way to get such footage is off a DVD, then you must have been living under a rock for some time.

1. The HDTV version shown on HDTV movie channels is just 480p, unfortunately.
2. And you would have just downloaded the native 480p version... or an upscaled to 1080p version, I suppose...
3. I have no idea what that even means...

Yeah. Unfortunately, LOTR, like many other movies, wasn't even shot with a high-enough-resolution camera for a true hi-def format. I believe the digital movie theaters upscaled it to 720p, but again, you would have to steal the reel from the theater to get it, and all it would be is an upconvert to match the native resolution of the projectors.

I'm not living under a rock; I'm simply saying that getting Lord of the Rings at a true resolution higher than 480p just isn't going to happen. You can get it in other ways than just DVD, but they all come from a 480p source format, because that's what the movie was shot in.

edit: That picture you gave me the link to is super-blurry and has artifacts all over the place... that's definitely a victim of bad upconversion...
 
I'm not living under a rock, simply saying that getting Lord of the Rings at a true resolution higher than 480p just isn't going to happen... you can get it off of other ways than just DVD, but they all come from a source format of 480p because that's what the movie was shot in.

No movie is ever shot in 480p; do you know what a 480p film would look like played back in a cinema? Believe me, it wouldn't look good. Even my 480p DVDs played on my projector at 60+ inches look bad compared to some of the hi-def stuff. To give you some understanding: about 5 years ago I was importing movies frame by frame into some of our systems (from the original film source) at BASE64 resolution (4096x6144), so I can assure you, any movie that has been out a long time can be re-scanned at hi-def.

Otherwise, by what you just said, you are basically saying that HD-DVD and Blu-ray are completely useless, as all that has happened is that they have taken the source of the movie (480p, which you believe it was shot in), upconverted it for us and slapped it on a disc... existing consumer hardware can already do that today.
 
Movies are SHOT at an incredibly high resolution, yes, but the final movie is NOT that resolution. It would take years just to render the special effects, for example, if the movie was rendered at 4096x6144. That's why it doesn't take an incredibly long time to do the SFX: it's only rendered at 720x480.

I never said HD-DVD is useless, although movies that weren't made with HD-DVD in mind will have to have a repass done on special effects (if it has any) because only the raw footage will be available.

Also, could you please show me your side-by-side comparisons? I've watched the Two Towers on HDNet and then played it back on my progressive-scan DVD player and didn't notice one lick of difference...
 
Sheesh, this topic has completely gone off track with the LOTR nonsense. Can we go back to discussing HD cards?

I think the last on-topic question was whether the CyberLink decoder is in fact bundled with the X1000 series cards, with the purchase link on the website being for other purposes?

The poster was waiting for someone to confirm. My opinion, supported by some ATI owners I know, is that you have to purchase the CyberLink decoder. That could be because they purchased their cards before it was released? So a current purchaser needs to confirm whether the H.264 decoder is on the CD or needs to be purchased for $14.95 from the ATI weblink.
 
Yes please, back on topic. Well, as of right now ATI has the better HD decoding technology in their cards, but that's not to say Nvidia is not going to come out with its own; whether it will be implemented through new hardware or updated software, we can't say. As of right now, ATI has it. The decision for you to make is whether to wait for Nvidia to come out with official support for 1080p decoding, or just go with ATI.
 