Do you want to run multiple monitors? Go nVidia. Their setup just seems more natural than ATI's.
Do you want to run an OS other than Windows? Go nVidia. They actually have support for them (does ATI support TV-out in Linux yet?).
Other than that, there's nothing in it.
No chance - ATI drivers are horrible on Linux. For a start, TV-out still isn't working on it...
You're going to have to go nVidia if you want Linux support.
Disagree - per pipe it's even or slightly in favour of the GTX, and per watt it's in favour of the GTX. ATI have gotten away with 16 pipes because they have boosted the clock speed - and it's worked, but that doesn't mean they are more efficient.
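To put rough numbers on it (stock clocks, so treat this as ballpark arithmetic, not a benchmark): the GTX pushes 24 pipes at 430MHz for roughly 24 x 430 = 10.3 Gpixels/s of theoretical fillrate, while the X1800 XT pushes 16 pipes at 625MHz for roughly 16 x 625 = 10.0 Gpixels/s. Near-identical totals - the clock speed is doing the work of the missing eight pipes.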
That is exactly what a hard launch is :rolleyes:. A hard launch means the product is available to buy on the day it launches - which is what nVidia did with the GeForce 7 series.
No, I just can't be bothered to put the money, time and effort into swapping it over - my GTX gives decent enough performance, so it's just not worth it for me to swap.
You people do realise that the GTX is equipped for 512MB as well? Look at the PCB: next to every memory module is the pad for another. If 512MB X1800s start slaughtering 256MB GTXs, they can start manufacturing 512MB GTXs easily enough.
Ok, I'm installing Windows on my new computer now. It has an XFX 7800GTX, a 4400+ X2, a Gigabyte K8MNF-9, an Antec SmartPower 2.0 500W and 1GB (2x512MB) of GeIL RAM (2-5-2-2 are its stock timings), plus two Seagate 120GB SATA drives running off the onboard SATA controller.
When installing Windows, at...
At the moment they are all probably coming out of the same plant (the PCBs, that is), thus they will all be the same. As supply progresses there will be more variety. I've seen red, black and blue nVidia PCBs - it's just that at the moment they are all the same, just branded differently.
I'm sick of people blowing it out of proportion. Have your opinion if you want, but it's a MINOR image fault in a fast-developing industry. I can tolerate minor faults - do you seriously think nVidia sits back in their labs saying
Hey guys, let's see how we...
Worrying - either they are using memory speed to compensate for a slow core, or the core is ridiculously fast and needs ridiculously fast memory to feed it.
Either way, using top-of-the-line memory is worrying for the price.
What are you smoking? Tribes was only slow for those who didn't know how to play it - I had a couple of memorable games where I capped out a map (8 caps) in less than five minutes.
The problem with Tribes was that, for the most part, it wasn't as easily accessible as CS - you actually had to learn...
The first release of CS was, what, June 1999? (Someone want to confirm this for me? I'm pretty sure it's correct though.)
The GeForce 256 was announced, from memory, in August 1999. How it played though - I can't remember :p.
And you know what you're missing? They disabled ALL optimisations. If you disable ALL optimisations on an ATi card you'll take a similar performance hit. If you disable just the optimisation that causes issues, the performance hit will be much smaller.
November 29 2004 - given that the 7800GTX has been specifically aimed at resolutions higher than 1600x1200, I would guess that they have overhauled it.
Well, it's not a single button press, but yes, it can be done. From memory it's:
Menu - navigate to Input (this will only need to be done once) - Enter - move the input from one to the other - Enter
Hopefully not return to the forums. After doing that, may I suggest building a bridge and getting over it - it's a minor fault which you never noticed until it was pointed out. Stop building it into something it isn't.
Probably because supply is extremely limited?
Your logic would make sense if, to create a dual-core processor, all you did was bolt two single cores together. That is not what happens, so your logic is not applicable.
EDIT: To elaborate on this, let's take a hypothetical example.
Every...
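To sketch the kind of numbers involved (purely illustrative figures, not measurements): a dual core shares the memory controller and surrounding logic between both cores, usually ships at a lower clock, and comes from better-binned silicon. So a hypothetical 90W single core doesn't become a 180W dual core - the real figure lands well below double.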
Bah, you seem to be taking it at face value, and you've accepted it. But yes, I used the wrong choice of words. In future readings, change "saying" to "accepting".
Ok, you can take it however you want. It's as logical as saying it's a point against Kimberly Securities Limited. The point is...
The law would disagree with you there. The nemo dat rule says that you can only pass on as good a title as you have. The person you bought it from cannot have had indefeasible title (as Intel retains this), and so cannot give you that title.
I'd try and get something out of it, but if it...
Let me get this right... You are saying that if you overclock an AMD processor, it will reach the same heat and power consumption as a stock Intel processor, and you are saying this is a point against AMD? That's just crazy logic - if it takes an overclock just to match Intel's stock heat and power draw, then at stock the AMD chip runs cooler and draws less, which is a point for AMD.
You know I really don't care about the graphics. I'm happy with 2000 graphics. I get wowed by 2005 graphics, but I don't need them and honestly don't care that much about them.
You know why this is?
Because I remember 1990 graphics.
Does anyone know if the HSI heatsink is separate from the main heatsink? I'm curious because, if I get one, I want to be able to swap the main heatsink for an aftermarket solution if need be, and the HSI heatsink needs to be separate for that to be possible.
An image of it can be found here, but...
Utter rubbish. ATI released products when they knew they could not meet demand. This led to increased markups. ATI is directly at fault, as they have supply issues.
I would agree, except nVidia isn't concentrating more resources on it at the moment, whereas ATI is devoting significant resources to getting CrossFire up and running...
While I would like to, that's not really feasible - I need to give my current Ti4600 to my parents, a 6600GT is the best graphics card I can afford, and I kinda want a graphics card in my computer :p.
The question is: will I be significantly held back by the processor if running...
Ok, I know it's getting on the old side now, but will a Barton 3000+ (not overclocked) coupled with a 6600GT and 1GB of generic RAM be enough to cope with all new games at 1280x1024?
If not, what would be the minimum AMD processor I can get away with to run at 1280x1024? It...