4200+ or ??

gaspah

I'm gonna build my mum an HTPC because she has a huge HD plasma but no HD content... and right now it's so cheap... this rig is only a few hundred.

mATX aluminium cube case
GA-MA78GM-S2H
4x 1GiB Corsair DDR2-800 <--- had this wasting away doing nothing on a desk... :confused::eek:
Athlon X2 4200+
500GB Samsung HDD
Samsung DVD drive
MS wireless keyboard/mouse


Now I'm going to test whether H.264 @ 1080p plays before I give it to them. I have access to a 5200+, a 6000+, an X3 8450 and an X4 9500 at work to swap in, but I doubt I'll have any issues there.
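
For what it's worth, before any chip swapping I'll sanity-check the test clip itself. A quick Python sketch (assuming ffprobe is installed, with "test_clip.mkv" as a placeholder for whatever sample file I end up using) to confirm the clip really is H.264 at 1920x1080:

import subprocess

# Ask ffprobe for the first video stream's codec and dimensions.
# "test_clip.mkv" is a placeholder for the actual sample file.
out = subprocess.check_output([
    "ffprobe", "-v", "error",
    "-select_streams", "v:0",
    "-show_entries", "stream=codec_name,width,height",
    "-of", "csv=p=0",
    "test_clip.mkv",
]).decode().strip()

codec, width, height = out.split(",")
print(codec, width, height)  # expect: h264 1920 1080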

The thing is, my little brother will probably want to spark up the graphics later and get it gaming... am I wasting money putting that 4200+ in there? Is that gonna keep up with some of the latest games? Will it do BioShock or CoD4 @ 1080p with decent settings, say if it had a card like a 3850/3870 or 8800GT?

I wanted to keep the clock lower to avoid needless heat, so I wanted to go with a lower-end processor (and my cheap ass wanted to avoid spending needless money too).

Definitely staying X2.
 
TechReport actually tested the 780G with that IDENTICAL board, and it shows only 50% CPU usage for AVC (aka H.264) 1080p Blu-ray playback @ 1920x1440 with an X2 4850e. The X2 4200+ should have no problem with playback, and if you're adding a discrete graphics card for CoD4/BioShock, an 8800GT or 3870 should further reduce the CPU utilization, just as the HD 3450 in the TechReport test cut it to 36%.
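
If you want to reproduce that kind of utilization number yourself, a rough sketch (assuming Python with the psutil package, run while the 1080p clip is playing) is just to sample and average overall CPU usage:

import psutil

# Sample total CPU utilization once per second for a minute of
# playback, then report the average (compare with/without GPU decode).
samples = [psutil.cpu_percent(interval=1) for _ in range(60)]
print("average CPU utilization: %.1f%%" % (sum(samples) / len(samples)))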
 
Sell me the X2 4200+ for $35 shipped, that's all I'm going to say =X

So you both think I'm on the money then... thanks for the input. :D
 
If the system will eventually be used for games, you'll want at least a 2.5-2.8 GHz X2 in there. For everything else, a 4200+ will be more than enough.
 
Also keep in mind that board is Hybrid CrossFire capable, so it may save a few $. Do your homework, though.
 
I have a 2.6 GHz dual core, and it has no problems with H.264 1080p video, as long as you use a good build of ffdshow. I can't imagine a minor clock-speed deficit would hurt that much.
 
What type of plasma is it? Reason I ask is only a few plasmas are true 1080p, and the ones that are cost some bucks. Even a card like the 8800GT will struggle in many games at 1080p resolutions. You may be better off at 720p, especially if the TV isn't capable of full 1920x1080 resolution.

My HTPC consists of an X2 3600+ @ 2.6GHz and an X1900 XTX video card, and it has no problems with H.264 video @ 1080p.
 
Yeah, the 5000+ BE is the way to go if you're buying one. The 4200+ is not an overclocker and is a midline processor. I upgraded from a 4200+; good chip, but not the best choice!

5000+ BE is the way to go, or maybe a 4000+; they're good too!

FLAKE
 
Even a card like the 8800GT will struggle in many games at 1080p resolutions. You may be better off at 720p, especially if the TV isn't capable of full 1920x1080 resolution.
Crysis maybe? I run a 7800 GTX and play CoD4, CS:Source, TF2, and BioShock at 1920x1200, though not on max settings. An 8800 GT would probably eat all those games for breakfast at that resolution, except Crysis.
 
I own an 8800GT. Crysis is not the only game it would struggle with. Assassin's Creed, and even Oblivion, at that high a res is going to push the card pretty hard, especially at max settings with AA/AF.

Just go look at the charts over at Tom's. An overclocked 8800GT averaged 30fps in Oblivion at 1920x1200, max settings with no AA or AF, and I have no doubt a game like Assassin's Creed would easily dip below a 30fps average at those same settings.

Sure, you can turn the settings down, but if the TV isn't capable of producing a true 1080p picture, you might as well do 720p. Otherwise you're just wasting GPU processing power rendering pixels you'll never see, which is why I asked about the model.
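
The pixel math makes the point; a tiny Python sketch:

# 1080p pushes 2.25x the pixels of 720p, so rendering 1080p for a
# 720p-native panel throws away more than half the GPU's work.
pixels_1080p = 1920 * 1080   # 2,073,600
pixels_720p = 1280 * 720     # 921,600
print(pixels_1080p / pixels_720p)  # 2.25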
 