Athlon vs Phenom + 8200/8300 or 780 chipset. Discuss here.

Since this seems to keep coming up in other threads, where it often isn't really on topic, I thought I'd throw it out in its own thread.

Issue: Athlon processors, like the low-power 4850e/5050e, run at the old HT speed of 1GHz. Phenom processors run at the newer HT speed of 1.8-2.0GHz.

Theory: Advanced post-processing of film-sourced 1080i video is greatly improved by using a Phenom processor instead of an Athlon.

At this time, AMD doesn't offer a low-power Phenom processor, though several are "in the works". So it comes down to a choice: how much of a difference does this post-processing make, and is it worth the added initial cost, heat, and electric bill?
 
You don't need a Phenom; you need a CPU with HT3.0. But you are correct: there are no low-power CPUs with HT3.0.

Frankly speaking, I see this as a good reason to go Intel with a 9300/9400 IGP-based mobo if you're concerned with low power consumption.
 
One of the posts you linked from avsforum indicated he had issues with HT that he was able to overcome by overclocking his HyperTransport by 20%. If this is true, does this negate the aforementioned requirement for HT3.0?

If the limiting factor truly is HyperTransport, and 1GHz is too slow while 1.8GHz is fast enough, there must be a point in between that breaks even.

I wish I had a copy of HDHQV to do some testing of my own, but I'm not really interested in buying it for my own satisfaction.

It would also be interesting to downclock the HyperTransport on a Phenom or whatever to 1.0GHz and see what effect that has.
 
One of the posts you linked from avsforum indicated he had issues with HT that he was able to overcome by overclocking his HyperTransport by 20%. If this is true, does this negate the aforementioned requirement for HT3.0?

If the limiting factor truly is HyperTransport, and 1GHz is too slow while 1.8GHz is fast enough, there must be a point in between that breaks even.
Hard to say. That post complained specifically about choppy playback, though, not necessarily the post-processing. I guess it all depends on how the nVidia drivers determine which features get enabled. If they do some sort of speed measurement, then it might work.
 
Does anyone have screenshots of the difference between HT1.0 and HT3.0, or can anyone say exactly what the problem will be? Poor picture quality, tearing, etc.? I'm having trouble figuring out the implications of the poor post-processing, but I might run into the problem when I get my tuner card in the mail tomorrow and start recording 1080i broadcasts.
 
Does anyone have screenshots of the difference between HT1.0 and HT3.0, or can anyone say exactly what the problem will be? Poor picture quality, tearing, etc.? I'm having trouble figuring out the implications of the poor post-processing, but I might run into the problem when I get my tuner card in the mail tomorrow and start recording 1080i broadcasts.

Seconded.

I plan on building my HTPC around the Zotac 8200 WiFi ITX board with a 4850e, and an m780 for QAM. I don't ever plan on recording anything; everything I watch will be streamed and acquired by "ways of actions". From what I'm gathering from various statements, OTA HD/480i/1080i shows/movies will be choppy and jittery?
 
Before anyone panics, take a deep breath... AFAIK, the only issue with the Athlon + 8200/8300 is with post-processing of film-sourced video (i.e. recorded at 1080p 24fps) that's been converted to 1080i 30fps via a process called TELECINE, which repeats fields so that every 4 film frames become 5 interlaced video frames, "stepping up" 24fps film to 30fps video.

IVTC, short for inverse telecine, can be done by the system to display the video at its original 1080p 24fps. This is a memory-intensive process that the bandwidth of HT1.0 (1GHz) is apparently insufficient for.
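
To make that concrete, here's a toy Python sketch (my own illustration, with frame labels standing in for real fields; actual IVTC in the drivers does field matching on pixels, not labels):

```python
# A toy model of 3:2 pulldown (telecine) and inverse telecine, using frame
# labels instead of real pixels. Interlaced frames are (top, bottom) field pairs.

def telecine(frames):
    """3:2 pulldown: repeat fields so every 4 film frames become 5 video frames."""
    fields = []
    for i, frame in enumerate(frames):
        fields += [frame] * (3 if i % 2 == 0 else 2)   # 3,2,3,2,... fields each
    return list(zip(fields[0::2], fields[1::2]))       # pair fields into 30fps frames

def ivtc(video):
    """Inverse telecine: match fields back up to recover the original film frames."""
    recovered = []
    for top, bottom in video:
        for field in (top, bottom):
            if not recovered or recovered[-1] != field:
                recovered.append(field)
    return recovered

film = ["A", "B", "C", "D"]   # four 24fps film frames
video = telecine(film)        # five interlaced 30fps frames
print(video)                  # [('A','A'), ('A','B'), ('B','C'), ('C','C'), ('D','D')]
print(ivtc(video))            # ['A', 'B', 'C', 'D'] -- the originals, restored
```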

To actually benefit from this processing, you need to have a 1080p TV capable of 24Hz, I believe. So it's pretty specialized.

There's a decent set of explanations here.
 
Oh... in that case, who gives a crap? lol. Stereodude is making it out to be the end of the world.
 
Before anyone panics, take a deep breath... AFAIK, the only issue with the Athlon + 8200/8300 is with post-processing of film-sourced video (i.e. recorded at 1080p 24fps) that's been converted to 1080i 30fps via a process called TELECINE, which repeats fields so that every 4 film frames become 5 interlaced video frames, "stepping up" 24fps film to 30fps video.
FWIW, this is most broadcast 1080i content.
To actually benefit from this processing, you need to have a 1080p TV capable of 24Hz, I believe. So it's pretty specialized.
Not true. You can see the result on any TV, even a 1080i TV. I can clearly tell the difference on my Hitachi 1080i RP-CRT TV.
 
Oh... in that case, who gives a crap? lol. Stereodude is making it out to be the end of the world.
It's a fairly serious issue if you watch a lot of 1080i TV. The result is visible on any 1080 TV (even interlaced ones).

Edit: Didn't you, in the past, make sure you had the lowest-power standalone card that supported all the video processing features? I've got a standalone 8600 in my AMD-based HTPC, and I certainly won't downgrade the image quality by moving to an IGP solution. If I can get the same image quality, then I'd consider it. For me, I need a new mobo and processor to take full advantage of 7.1 PCM audio while keeping the image quality I have now. If I'm going to buy a new processor and motherboard, I'm going Intel with the 9300/9400 due to lower power consumption and heat load.
 
Does anyone have screenshots of the difference between HT1.0 and HT3.0, or can anyone say exactly what the problem will be? Poor picture quality, tearing, etc.? I'm having trouble figuring out the implications of the poor post-processing, but I might run into the problem when I get my tuner card in the mail tomorrow and start recording 1080i broadcasts.
You will effectively lose half of your vertical resolution (like with all deinterlacing schemes) because you will be deinterlacing the content instead of restoring the original progressive frames. Also note that IVTC is not the only thing missing: the more advanced deinterlacing schemes (spatial-temporal deinterlacing) are also unavailable.
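
A toy sketch of that point (again my own illustration; a real bob deinterlacer interpolates rather than duplicating lines, but the missing-information problem is the same):

```python
# Each interlaced field carries only half the scanlines, so a simple "bob"
# deinterlacer has to fabricate the other half.
frame = [f"line{i}" for i in range(8)]   # one progressive frame, 8 scanlines
top_field = frame[0::2]                  # even scanlines only
bottom_field = frame[1::2]               # odd scanlines only

def bob(field):
    """Rebuild full height by doubling each field line (detail is lost)."""
    out = []
    for line in field:
        out += [line, line]              # duplicated, not true picture data
    return out

print(bob(top_field))   # 8 lines again, but only 4 unique ones: half the detail
```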
 
I don't understand; 1GHz HT has comparable if not better bandwidth than Intel's 1066 and lower FSBs. Are you saying I would have to have an Intel FSB of 1333 or better (or QPI) to get smooth playback? I think it is more related to the video card's abilities (or lack thereof) and reliance on the CPU's performance than to bandwidth limitations. Any benchmark will show that AMD CPUs have lower multimedia performance than Intel's, especially the X2 line. Phenom/Phenom IIs have made up some ground; that may have something to do with bandwidth, but more likely with optimizations to the integer pipeline.

http://www.avadirect.com/forum/forum_posts.asp?TID=186

Front Side Bus

Since the FSB is 64 bits wide and runs between 1066 MT/s and 1600 MT/s, we can effectively determine the bandwidth.

FSB             Bandwidth
1066 MT/s       8.5 GB/s
1333 MT/s       10.6 GB/s
1600 MT/s       12.8 GB/s
2000 MT/s (OC)  16.0 GB/s

HyperTransport

HT clocks have recently ranged from 800 MHz on low-end processors up to 4.0 GHz on the newest quad-core processors. That takes the clock multiplier into account but not the DDR signaling. Since the bit width is 32, determining the bandwidth is possible.

HT clock   Bandwidth   Bandwidth (duplex)
800 MHz    6.4 GB/s    12.8 GB/s
2000 MHz   16.0 GB/s   32.0 GB/s
3200 MHz   25.6 GB/s   51.2 GB/s
4000 MHz   32.0 GB/s   64.0 GB/s
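
Both tables fall out of the same formula: bandwidth = bus width in bytes × transfers per second, where HT is DDR (two transfers per clock) and full duplex (independent links each way). A quick Python sanity check of the figures quoted above:

```python
# Bandwidth = bus width (bytes) x transfer rate. The FSB is quoted in MT/s;
# HyperTransport is quoted by clock, is DDR (2 transfers per clock), and is
# full duplex (separate links in each direction).

def fsb_bandwidth_gbs(mt_per_s, width_bits=64):
    return mt_per_s * (width_bits // 8) / 1000           # GB/s

def ht_bandwidth_gbs(clock_mhz, width_bits=32, duplex=False):
    transfers = clock_mhz * 2                            # DDR signaling
    one_way = transfers * (width_bits // 8) / 1000       # GB/s
    return one_way * 2 if duplex else one_way

for mt in (1066, 1333, 1600, 2000):
    print(f"FSB {mt} MT/s: {fsb_bandwidth_gbs(mt):.1f} GB/s")

for mhz in (800, 2000, 3200, 4000):
    print(f"HT {mhz} MHz: {ht_bandwidth_gbs(mhz):.1f} GB/s, "
          f"{ht_bandwidth_gbs(mhz, duplex=True):.1f} GB/s duplex")
```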
 
I think the difference is that the memory controller and IGP on the Intel platform are both in the northbridge chip, whereas the memory controller on the AMD platform is in the CPU.

While having the memory controller in the CPU is ideal for CPU access, it's less ideal for IGP access, as the IGP has to pass data through the CPU to reach the RAM, effectively doubling the throughput required.

However, this is all just speculation until someone with a Phenom downclocks it to 1GHz HT and tests it against an Athlon with similarly clocked HT.
 
I'm finding this discussion fascinating. I was stunned reading this in the original post. I am currently planning my HTPC build and am now questioning my CPU/mobo combo. You guys who know a helluva lot more than me, keep up the discussion... I really don't want to build an HTPC that delivers anything less than perfect playback... especially if perfect is only $50-100 more.
 
At this point, it seems likely there's a bottleneck somewhere in the Athlon + 8200/8300 combo that limits the availability of higher-level IQ filters in the nVidia drivers. Either that, or there's something fishy going on.

The question I thought of is this... If your TV refreshes at 60Hz, meaning 60 static images a second, and your source video is 30fps, then you get 2 images in a row that are the same, correct? If you IVTC a 1080i video back to 1080p 24Hz, then you should get the following series: aaa bb ccc dd eee ff, repeating. Isn't that uneven repetition going to make the video look funny unless you output at 24Hz to your TV?

As for "perfect playback", you have a few options. There are low power cpus coming down the line based on HT3.0, they're just not here now. You can either a. go the Intel route, and spend more, but get more cpu capability, or b. go the AMD route, and make a choice between having the high level IQ filters right away, though likely more power draw/heat/noise by getting a phenom x3/x4 or 7750 puma cpu, or go with the cheap 4850/5050e processor now, and upgrade to a 45W x2 with HT3.0 later.

Seems like the real question is whether you currently record, or soon will record, 1080i broadcasts for playback. As it stands, there is NO ISSUE with 1080p playback on the Athlon + 8200/8300.
 
The question I thought of is this... If your TV refreshes at 60Hz, meaning 60 static images a second, and your source video is 30fps, then you get 2 images in a row that are the same, correct? If you IVTC a 1080i video back to 1080p 24Hz, then you should get the following series: aaa bb ccc dd eee ff, repeating. Isn't that uneven repetition going to make the video look funny unless you output at 24Hz to your TV?
That would be judder. That is one of the reasons people are after sets that accept 1080p24 and display it at a multiple of 24Hz, like 24Hz, 48Hz, 72Hz, or 120Hz.

Even when deinterlaced you will still get judder. The IVTC action doesn't make it any worse.
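
The cadence math is easy to play with (my own sketch, nothing driver-specific): count how many display refreshes each source frame occupies. Uneven counts are the judder; refresh rates that are a multiple of the source rate come out even.

```python
# How many display refreshes does each source frame occupy?
# Uneven repeat counts are what you perceive as judder.

def cadence(source_fps, display_hz, n_frames=6):
    counts, shown = [], 0
    for i in range(1, n_frames + 1):
        until = int(i * display_hz / source_fps)   # refresh where frame i ends
        counts.append(until - shown)
        shown = until
    return counts

print(cadence(24, 60))  # [2, 3, 2, 3, 2, 3] -> uneven 3:2 repeats = judder
print(cadence(24, 72))  # [3, 3, 3, 3, 3, 3] -> even, no judder
print(cadence(30, 60))  # [2, 2, 2, 2, 2, 2] -> even, no judder
```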

I will try to use an AviSynth script later to process a 1080i recording with an IVTC, and again with a deinterlacing filter, to show the difference in image quality.
 
I'm finding this discussion fascinating. I was stunned reading this in the original post. I am currently planning my HTPC build and am now questioning my CPU/mobo combo. You guys who know a helluva lot more than me, keep up the discussion... I really don't want to build an HTPC that delivers anything less than perfect playback... especially if perfect is only $50-100 more.
There are some options, depending on what you've bought already, if you're after "perfect". If you haven't bought a CPU or mobo, I say go the Intel route. If you already have a CPU and motherboard, you can either get a 7750 (with HT3.0) or add a standalone graphics card.
 
So I'm currently building an HTPC (I've posted a few threads about it), planning on going with an nVidia 8200 or 8300 board (depending on what's available) and a 5050e for the CPU. My only concern (and this thread might not apply to me) is that I plan on recording OTA DTV using two 2250 tuners. Would I experience any of the problems you are referring to? I have no plans to rip DVDs or Blu-rays to the computer anytime soon. I simply want to ensure that if I use the tuners and VMC to record TV at the best settings, I won't see any of these issues.

Would it be worth it to upgrade to the Kuma now, or keep the 5050e and upgrade at a later date?
 
So I'm currently building an HTPC (I've posted a few threads about it), planning on going with an nVidia 8200 or 8300 board (depending on what's available) and a 5050e for the CPU. My only concern (and this thread might not apply to me) is that I plan on recording OTA DTV using two 2250 tuners. Would I experience any of the problems you are referring to? I have no plans to rip DVDs or Blu-rays to the computer anytime soon. I simply want to ensure that if I use the tuners and VMC to record TV at the best settings, I won't see any of these issues.

Would it be worth it to upgrade to the Kuma now, or keep the 5050e and upgrade at a later date?

In short, yes: because the 2250 can record 1080i, you will most likely want some form of post-processing for playback. The Kuma is only a little more expensive, with more heat and power draw, but it supports all the post-processing.

Of course, your other option is to go the nVidia+Intel route with an E5200 + 9300/9400.
 
In short, yes: because the 2250 can record 1080i, you will most likely want some form of post-processing for playback. The Kuma is only a little more expensive, with more heat and power draw, but it supports all the post-processing.

Of course, your other option is to go the nVidia+Intel route with an E5200 + 9300/9400.

This is a stupid question (I have many of those :() but where would I find out whether my current tuner is recording in 1080i vs. 1080p? I use VMC. I don't think I have that issue with the build in my sig (damn well better not, with over $1k in the processor and GPU :))

So, in your opinion, it's better to go with the Kuma now, correct? My build can handle it (both the power draw and the expense); just wanting to verify. Do you think the Noctua 92mm CPU cooler would be enough to cool it?
 
I believe some OTA channels are in 1080i. I don't know if it's telecined content (where IVTC would apply), though.
So I would like to know the answer to this too!
 
http://www.hauppauge.com/site/products/data_hvr2250.html

Supports all ATSC formats, up to the highest definition 1080i format! :p

The manufacturer's website is usually a good place to start. Yes, either a Kuma, or maybe even a Phenom X3, since it's the same power envelope either way (95W) and you can get a 2.3GHz X3 for ~$25 more than the 2.7GHz X2 Kuma.

I wish I'd known this before I went to a 4850e + 8200 mobo. :(
 
http://www.hauppauge.com/site/products/data_hvr2250.html

Supports all ATSC formats, up to the highest definition 1080i format! :p

The manufacturer's website is usually a good place to start. Yes, either a Kuma, or maybe even a Phenom X3, since it's the same power envelope either way (95W) and you can get a 2.3GHz X3 for ~$25 more than the 2.7GHz X2 Kuma.

I wish I'd known this before I went to a 4850e + 8200 mobo. :(

Suddenly I am soooo glad I didn't pull the trigger on my HTPC build just yet... I may wait even longer to see what happens in this thread and learn from others' mistakes... rather than my own (which tend to be more expensive :D)
 
My only concern (and this thread might not apply to me) is that I plan on recording OTA DTV using two 2250 tuners.
It does apply. All OTA HD is 1080i except for FOX and ABC (which are 720p); NBC, CW, PBS, & CBS are all 1080i. Most shows are shot at 1080p24 and then turned into 1080i60 through a telecine process. And those that aren't can still benefit from the superior deinterlacing offered by an HT3.0 solution.

Your card will record the native transport stream that's being broadcast, meaning that you will get 1080i.
 
Looks like the CPU to watch for is the Phenom X3 8450e (2.1GHz / 65W) when the price drops... there's no good reason for it to cost more than an 8650, but apparently it does.
 
Well, I guess when my 2250 comes in tomorrow, I'll test it with my computer at stock settings and then see if there's a difference with the HT overclocked, like the person on avsforum suggested.
 
:rolleyes: Not a big issue when used with an 8000/9000 series board. AMD's hardware adds more post-processing for video (to render it looking nearly as good as NV's picture quality), which is why it actually benefits from HT 3.0.

Stereodude makes it sound like the end of the world when it doesn't matter much; he points to HQV benchmarks as proof that HT 3.0 makes a difference, when HQV is about as great a benchmark as 3DMark is for figuring out how "badass" your video card is.

HQV is completely subjective, and the worst part is that no one posts pictures or videos of the "issues" they see and have rated lower. Completely subjective. That's like writing a video card review based entirely on "how smooth it feels" without actually posting any benchmark info, screenshots, anything.

The 780G benefits more from HT 3.0 simply because it's running more post-processing on video. This does matter, but not all that much, when it comes to OTA HD sources that broadcast in MPEG2, but it's still not a big deal because these signals are broadcast... You know what? Whatever. This is entirely old and has been a known "issue" since the 780G dropped, and it isn't really a problem anymore thanks to driver work over the last year or two.

The point is there are very few situations now where this problem actually makes a difference, and from the sounds of it, Stereodude is in one of them (he's using an RPTV).
 
I plan on building my HTPC around the Zotac 8200 WiFi ITX board with a 4850e, and an m780 for QAM. I don't ever plan on recording anything; everything I watch will be streamed and acquired by "ways of actions". From what I'm gathering from various statements, OTA HD/480i/1080i shows/movies will be choppy and jittery?
You'll be fine with that combo.
 
It would also be interesting to downclock the HyperTransport on a Phenom or whatever to 1.0GHz and see what effect that has.
From friends of mine who've actually done just that, they've seen no differences whatsoever, with either an 8200 or an 8300.
 
I've been trying to find references to IQ that aren't entirely based on HD HQV benchmark values, and as of yet, haven't found any. I would also like to see a Phenom's HT lowered to 1GHz and a 4850e/5050e's HT raised to 1.2 or 1.25GHz to see if that affects the benchmark. Of course, since there's no free or trial version of it, I don't especially feel like spending money on it, any more than I've ever felt like spending money on Futuremark or SiSoft benchmark products.
 
:rolleyes: Not a big issue when used with an 8000/9000 series board. AMD's hardware adds more post-processing for video (to render it looking nearly as good as NV's picture quality), which is why it actually benefits from HT 3.0.

Stereodude makes it sound like the end of the world when it doesn't matter much; he points to HQV benchmarks as proof that HT 3.0 makes a difference, when HQV is about as great a benchmark as 3DMark is for figuring out how "badass" your video card is.

HQV is completely subjective, and the worst part is that no one posts pictures or videos of the "issues" they see and have rated lower. Completely subjective. That's like writing a video card review based entirely on "how smooth it feels" without actually posting any benchmark info, screenshots, anything.

The 780G benefits more from HT 3.0 simply because it's running more post-processing on video. This does matter, but not all that much, when it comes to OTA HD sources that broadcast in MPEG2, but it's still not a big deal because these signals are broadcast... You know what? Whatever. This is entirely old and has been a known "issue" since the 780G dropped, and it isn't really a problem anymore thanks to driver work over the last year or two.

The point is there are very few situations now where this problem actually makes a difference, and from the sounds of it, Stereodude is in one of them (he's using an RPTV).
Denial's not just a river in Egypt anymore...
 
I've been trying to find references to IQ that aren't entirely based on HD HQV benchmark values, and as of yet, haven't found any. I would also like to see a Phenom's HT lowered to 1GHz and a 4850e/5050e's HT raised to 1.2 or 1.25GHz to see if that affects the benchmark. Of course, since there's no free or trial version of it, I don't especially feel like spending money on it, any more than I've ever felt like spending money on Futuremark or SiSoft benchmark products.

Overclocking the HT used to get rid of some issues where you'd see stutter/judder, but that was back when the 780G came out and NV had their 6150 (the 6150 benefited greatly from OCing the HT; in some cases it would be completely useless for HD content without it).

If you see no problems and have no problems, then why is that a problem for you? Like I said, Stereodude is in one of the very few situations where this actually becomes an issue. He's got an RPTV, which needs to do an additional analog-to-digital conversion (and then maybe another if he's using an analog connection from his HTPC), all of which makes this issue actually visible on digital/interlaced content. If you've got a pure digital connection/source, then it doesn't matter and you're not going to see it.
 
Denial's not just a river in Egypt anymore...
And the sky isn't falling. This is hardly an issue for anyone with a pure digital setup. You can't prove you're right, and you can't say that your experience is the standard. There's a reason you had/have this issue. I'm not saying it doesn't exist; I am saying it doesn't matter to the vast majority of people here.
 
Is there a [H]ard|OCP guy who does HTPCs? It would be nice to have this explored a little more in depth before any great conclusions are drawn, based primarily on the results of a benchmark that's really not designed for HTPCs but for standalone players, and that's primarily a tool for selling product. http://www.hqv.com/products.cfm

Anyone else smell a conflict of interest?
 
Is there a [H]ard|OCP guy who does HTPCs? It would be nice to have this explored a little more in depth before any great conclusions are drawn, based primarily on the results of a benchmark that's really not designed for HTPCs but for standalone players, and that's primarily a tool for selling product. http://www.hqv.com/products.cfm

Anyone else smell a conflict of interest?
Well, the problem is, other than actually building the damn thing and playing around with it, you won't know if something is wrong with it, and there really isn't a way to "benchmark" image quality on an HTPC because of how subjective it is.
 
You can do side-by-side comparisons of screenshots... HOCP does that for video card reviews and screenshots of games, so why not? If there are clarity issues, you'd be able to tell, right?
 
You can do side-by-side comparisons of screenshots... HOCP does that for video card reviews and screenshots of games, so why not? If there are clarity issues, you'd be able to tell, right?

This whole discussion makes me think of this review, and of why most tech sites do terrible HTPC-related reviews: http://techreport.com/articles.x/8208/10

There were so many things wrong with that review: the (Hauppauge) drivers used in it had been pulled from the manufacturer's site weeks before the review hit, due to lower performance (check the screenshots). The "winner" was also one of the worst tuners made, known for its crappy image quality.
 