I wrote a paper about this (privatization) in community college. It went over their heads but I didn't care.
I'm all for free enterprise, but think of it like a computer: do you want your kernel running as a system service or as a local user?
I've seen it years ago, actually, before either of these products was out. You're like one of those leg-humping dogs, dude. For an old guy, you need to chill.
I'm not here to have a pissing contest to prove whose perspective is valid.
Nor am I going to spend time invalidating you. I'm merely using this as a means to channel a frustration at a company. I'm no fanboy but I'd love to use the card I have now because if I have to buy in again it...
I understand for 3x 3D view. That's cool. But rendering a picture at a giant resolution and splitting it across many screens isn't exactly groundbreaking. Plus you have to have the exact same screens. Is the math too complex to compute a mixed-resolution rig? What, is the power of CUDA...
Odd that the article says SLI is used because of the extra DVI ports.
FISHY
I've paid into this pit of crap for years. What a money-grabbing bullshit move, nVidia.
2010 doesn't take an eternity to load; it's generally faster in its operation than 2007.
The layout is completely different, though. It's more efficient, but some folks don't want to relearn the menus.
They're ripping off the Steam community. They've got a lot of problems for a game that has been around five years with a constant stream of play data. I can't believe they still have this many problems. Ick, their installer is a POS to boot.
However, even though you may get screwed over by the...
Turn down view distance unless you have 3.4 GHz.
Tested it to death; that's the frequency WoW likes at ultra. Otherwise you'll be beating the crap outta your video card.
There seems to be some grey area about what the actual HT link speed is for a CPU.
On auto settings my HT link is 2.0 GHz. HT 3.0 on AM3 chips is supposed to support a 2.6 GHz HT link. The highest I can get is 2.4 GHz at the default 1.20 V HT link voltage.
I tried 2.6 GHz at 1.30 V...
Well, my ViewSonic VP finally died. :( I spent two days trying to get it to come back on, but I had to give up on it because I needed to continue with my work.
I opted for the ASUS 1080p widescreen, as my brother got the Samsung version.
I'm so impressed with this leap in resolution that I've gone...
I've had an issue over time where the monitor would no longer wake from sleep. The monitor would turn off even if I set the power scheme to never turn off.
Eventually the screen stopped coming on altogether. Turns out there is a factory reset for ViewSonics that is not in the manual.
Hold...
That's a ground-level perspective. Do you think Steve Jobs burdens his mind with all the tech poop? No, that's the tech monkeys' job; the CEO only cares about moola. I doubt they're looking that far ahead. They wouldn't resort to this kind of PR just to get someone to develop for their products...
I'm no fanboi of either product.
The newest OSX is very good though, and I actually enjoy it when I have to use it.
Has that always been the case with OSX? Nope, but now it's getting to the point where nearly everything is solid (for the most part), and their biggest problem now is Adobe...
Adobe says Apple's OSX is closed, so they can't write hardware acceleration for video.
Apple says Adobe's software is proprietary, so they can't patch around it.
Who should budge?
Hmm, it's all becoming clear now. Apple is mad at Adobe. OSX gets tied up with Flash, the i7 toasts their machines, and that causes crazy support costs. The circle is complete. :D
Why not patch in a CPU throttle-down when the temp gets up there?
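The throttle-down idea is basically a little control loop: step the clock down when the die gets hot, step it back up when it cools, and do nothing in between so it doesn't oscillate. Here's a minimal sketch of that logic; the temperature thresholds, frequencies, and function name are all made up for illustration, not anything Apple actually ships:

```python
# Hypothetical thermal throttle logic. All numbers are invented
# examples, not real firmware values.

def next_frequency_mhz(temp_c, current_mhz,
                       min_mhz=800, max_mhz=2600, step_mhz=200):
    """Return the CPU frequency a governor should request next,
    given the current die temperature and current frequency."""
    if temp_c >= 90:
        # Too hot: back off one step (but never below the floor).
        return max(min_mhz, current_mhz - step_mhz)
    if temp_c <= 75:
        # Cooled down: ramp back up one step (capped at max).
        return min(max_mhz, current_mhz + step_mhz)
    # In the hysteresis band between 75 and 90: hold steady,
    # so the clock doesn't bounce up and down every poll.
    return current_mhz
```

The gap between the hot and cool thresholds is the important part; without that hysteresis band the frequency would flap on every reading near the cutoff.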
Source? ;)
Just kidding, I'm not a douche.
But that is just your opinion based on criteria that only you give a shit about.
It doesn't matter what you think; it matters what Apple thinks. And I'm pretty sure AMD is hitting all the points they're looking for.