Bit-tech 5770 Preview

Interesting. I'm curious to see actual game performance tests, to find out whether it lands closer to the 4890 or the 4870...
 
If Fudzilla's guess of 60% performance boost over the HD 4770 is in the ballpark, then the HD 5770 should be a little slower than the HD 4870. However, the price point will determine whether it's a good bang for your buck. If it's set too high, you might as well step up to the HD 5850.

Price-wise, I would think it will be priced between the HD 4870 and HD 4890, primarily due to HD 4870-ish performance with a significant decrease in power consumption.
 
^ those performance estimates are low

I think you guys will be pleasantly surprised

For some reason, I think the 5770 might offer us GTX 275 performance on the cheap. A sub-$200 card that might be able to run at 1920x1080, if you can live without high amounts of AA, seems like a pure winner.
 
Other reviews show the 5750 running slightly faster than the 4870, so I imagine the 5770 should be faster than the 4890. Plus, these are all beta drivers.
 
Other reviews show the 5750 running slightly faster than the 4870, so I imagine the 5770 should be faster than the 4890. Plus, these are all beta drivers.

Exactly what I am hoping for. All I need is a 4870 to play what I play at my resolution, so the 5750 should be fine for me, and the power consumption is exactly what I was looking for at that performance level.

^ those performance estimates are low

I think you guys will be pleasantly surprised

Oh you tease...! :eek::p
 
If Fudzilla's guess of 60% performance boost over the HD 4770 is in the ballpark, then the HD 5770 should be a little slower than the HD 4870.

No, it means it would be faster and better than a 4870. Speed-wise it should be faster than the HD 4890 when overclocking. Due to the awesome nature of its 40nm design it will scale better than the HD 4890 when overclocking. Xfire performance will be mega kill. Then it will have big power in the shaders with DirectX 11 magic.

It's going to be legendary. :cool:
 
^ those performance estimates are low

I think you guys will be pleasantly surprised

How, exactly? The 5770 has, from what I've read, a 128-bit bus. Unless we see a serious increase in GDDR5 memory speeds (e.g., the 1.2 GHz speed on the 5870), the card will be bandwidth-constrained at 4890+ speeds.

Since I doubt they'll waste top-end GDDR5 memory on a midrange card, I foresee maybe 1 GHz tops. About the only thing we can hope for is 4870 speeds.
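For anyone wanting to sanity-check that bandwidth argument, here is a minimal Python sketch. The 4870/4890 figures are the published specs; the 5770 row uses the rumored 128-bit bus and a 1.2 GHz GDDR5 clock, so treat it as an assumption until the card actually launches:

```python
# Peak memory bandwidth: (bus width in bytes) x (effective transfer rate).
# GDDR5 moves 4 bits per pin per memory clock (quad data rate).
def gddr5_bandwidth_gb_s(bus_width_bits, memory_clock_mhz):
    bytes_per_transfer = bus_width_bits / 8
    effective_rate_mhz = memory_clock_mhz * 4
    return bytes_per_transfer * effective_rate_mhz / 1000  # GB/s

cards = [
    ("HD 4870  (256-bit, 900 MHz GDDR5)",           256, 900),
    ("HD 4890  (256-bit, 975 MHz GDDR5)",           256, 975),
    ("HD 5770? (128-bit, 1200 MHz GDDR5, rumored)", 128, 1200),
]

for name, bus, clock in cards:
    print(f"{name}: {gddr5_bandwidth_gb_s(bus, clock):.1f} GB/s")
```

Even with 1.2 GHz GDDR5, a 128-bit bus works out to roughly 77 GB/s, against about 115-125 GB/s on the 4870/4890, which is the gap this post is pointing at.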
 
How, exactly? The 5770 has, from what I've read, a 128-bit bus. Unless we see a serious increase in GDDR5 memory speeds (e.g., the 1.2 GHz speed on the 5870), the card will be bandwidth-constrained at 4890+ speeds.

Since I doubt they'll waste top-end GDDR5 memory on a midrange card, I foresee maybe 1 GHz tops. About the only thing we can hope for is 4870 speeds.

Memory is supposedly at 1150 MHz according to Fud.

http://www.fudzilla.com/content/view/15838/1/

And really, you're questioning a guy who has the cards in hand and is actively testing them?
 
How, exactly? The 5770 has, from what I've read, a 128-bit bus. Unless we see a serious increase in GDDR5 memory speeds (e.g., the 1.2 GHz speed on the 5870), the card will be bandwidth-constrained at 4890+ speeds.

Since I doubt they'll waste top-end GDDR5 memory on a midrange card, I foresee maybe 1 GHz tops. About the only thing we can hope for is 4870 speeds.

How the fuck can you doubt the guy who has the card in his hand and is currently testing it, if he isn't already done?
 
How, exactly? The 5770 has, from what I've read, a 128-bit bus. Unless we see a serious increase in GDDR5 memory speeds (e.g., the 1.2 GHz speed on the 5870), the card will be bandwidth-constrained at 4890+ speeds.

Since I doubt they'll waste top-end GDDR5 memory on a midrange card, I foresee maybe 1 GHz tops. About the only thing we can hope for is 4870 speeds.

I don't even know how to respond anymore to this idea that the memory bit depth is so important, like that one spec makes or breaks the card, or holds any real value now that GDDR5 is in use. Perhaps it is ignorance? I don't know; hopefully I can make it clear again in my article (as I've done in so many past articles). I feel like I'm repeating myself over and over and over and over and over....
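To put a rough number on why bus width alone stopped being a useful yardstick once GDDR5 arrived, here is a small sketch. The GDDR3 card is the HD 4850's published spec; the 128-bit GDDR5 figure reuses the rumored 5770 memory clock from this thread, so it is an assumption:

```python
# Bandwidth depends on the memory type as much as the bus width:
# GDDR3 transfers 2 bits per pin per clock, GDDR5 transfers 4.
def bandwidth_gb_s(bus_width_bits, memory_clock_mhz, bits_per_pin_per_clock):
    return (bus_width_bits / 8) * (memory_clock_mhz * bits_per_pin_per_clock) / 1000

print(bandwidth_gb_s(256, 993, 2))   # HD 4850, 256-bit GDDR3          -> ~63.6 GB/s
print(bandwidth_gb_s(128, 1200, 4))  # 128-bit GDDR5 (rumored 5770)    -> ~76.8 GB/s
```

Half the bus width, yet more raw bandwidth: the single "bit depth" number no longer tells you much on its own.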
 
I don't even know how to respond anymore to this idea that the memory bit depth is so important, like that one spec makes or breaks the card, or holds any real value now that GDDR5 is in use. Perhaps it is ignorance? I don't know; hopefully I can make it clear again in my article (as I've done in so many past articles). I feel like I'm repeating myself over and over and over and over and over....

hey - we try to bug honest users via flame wars / ignorance :)
 
All I know is how the cards perform in games, and I'm impressed with the results. It was better than I thought it would be. Can't really say any more than that until the launch next week. Stay tuned to [H] for all your gaming hardware goodness; I've got a lot of neat things planned to test this month.
 
All I know is how the cards perform in games, and I'm impressed with the results. It was better than I thought it would be. Can't really say any more than that until the launch next week. Stay tuned to [H] for all your gaming hardware goodness; I've got a lot of neat things planned to test this month.

Stop it! You are making me more anxious than I want to be haha.... BTW can you just tell us how long the card is?
 
I'm really looking into the 5770 and 5750. Like the other guy here, it would be a huge step: Radeon 8500 (4 years ago) -> X1600 Pro (months ago) -> 785G (weeks ago) -> HD 5700 (future) xD
 
Stop it! You are making me more anxious than I want to be haha.... BTW can you just tell us how long the card is?

I'd like to know this as well. I want to put one of these paired with something like an Athlon X4 into a SG05/ITX build. I'm assuming it's shorter than 9.5".
 
I don't even know how to respond anymore to this idea that the memory bit depth is so important, like that one spec makes or breaks the card, or holds any real value now that GDDR5 is in use. Perhaps it is ignorance? I don't know; hopefully I can make it clear again in my article (as I've done in so many past articles). I feel like I'm repeating myself over and over and over and over and over....

Well, I doubt the people spewing the bit depth thing honestly read these articles. They probably remember back when the original GeForce came out with its higher bit rate and the card was AMAZINGLY OMG SO FAST, with no mention of the new architecture.

So yeah, be prepared to re-repeat yourself (possibly to the same people!) many more times, as they won't read up on it.
 
All I know is how the cards perform in games, and I'm impressed with the results. It was better than I thought it would be. Can't really say any more than that until the launch next week. Stay tuned to [H] for all your gaming hardware goodness; I've got a lot of neat things planned to test this month.

Spill the beans already! :D

Looking forward to the review. What kind of goodness is in the pipeline? :D Borderlands performance?
 
The 4870 and 4890 also use GDDR5 but have a 256-bit bus; this is why people are suspicious about the 5770 having 4890-level performance. Nothing personal :D
I don't even know how to respond anymore to this idea that the memory bit depth is so important, like that one spec makes or breaks the card, or holds any real value now that GDDR5 is in use. Perhaps it is ignorance? I don't know; hopefully I can make it clear again in my article (as I've done in so many past articles). I feel like I'm repeating myself over and over and over and over and over....
 
So, ATI is ready to annihilate Nvidia's current offering from top to bottom? lol
 
§kynet said:
Nope. Nvidia has PhysX, which makes their $129 cards faster than a $399 ATI card. Nvidia said so.

My HTPC is dying to get that $159 "Slower Card".

LOL :p
 
I don't even know how to respond anymore to this idea that the memory bit depth is so important, like that one spec makes or breaks the card, or holds any real value now that GDDR5 is in use. Perhaps it is ignorance? I don't know; hopefully I can make it clear again in my article (as I've done in so many past articles). I feel like I'm repeating myself over and over and over and over and over....

It's not a hard concept to grasp. With the new architecture, bit depth isn't as important as it was, and the card doesn't need lots of bandwidth to perform well. Kind of like how AMD showed us with the Athlon that clock speed isn't the be-all and end-all of CPU performance.
That was a good thing, and so is this: a smaller bit depth with comparable performance means cards that cost less to produce, and we're already seeing those savings with this series.
 
800 shaders at 850 MHz? So it is a DX11-capable HD 4890 with a little less memory bandwidth. I don't think I need a review to guess the performance of the card.
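For reference, here is the peak shader math behind that comparison, using the rumored 5770 figures from this thread (800 SPs at 850 MHz, an assumption until launch) and the published 4870/4890 clocks:

```python
# Peak single-precision throughput: stream processors x 2 FLOPs per clock (MAD) x core clock.
def peak_gflops(stream_processors, core_clock_mhz):
    return stream_processors * 2 * core_clock_mhz / 1000

print(peak_gflops(800, 750))  # HD 4870 (800 SPs at 750 MHz) -> 1200 GFLOPS
print(peak_gflops(800, 850))  # HD 4890 (800 SPs at 850 MHz) -> 1360 GFLOPS
print(peak_gflops(800, 850))  # HD 5770, rumored spec        -> 1360 GFLOPS
```

On paper the shader throughput matches the 4890 exactly; memory bandwidth (see the earlier sketch in this thread) is the spec that differs.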
 
800 shaders at 850 MHz? So it is a DX11-capable HD 4890 with a little less memory bandwidth. I don't think I need a review to guess the performance of the card.

/facepalm


I'd like to know this as well. I want to put one of these paired with something like an Athlon X4 into a SG05/ITX build. I'm assuming it's shorter than 9.5".

From the pictures compared to the HD5850 I would make a wild guess and say it is about 8.5" long.
 