On the last page of TechSpot's Crysis performance review, they posted some 32-bit vs. 64-bit figures for Crysis:
http://www.techspot.com/article/73-crysis-performance/page8.html
GeForce 8800 GTX (768MB) 32-bit
1440x900 = 38.6fps
1680x1050 = 30.8fps
1920x1200 = 25.8fps
GeForce 8800 GTX (768MB) 64-bit
1440x900 = 45.6fps
1680x1050 = 36.4fps
1920x1200 = 30.0fps
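For what it's worth, a quick bit of arithmetic on those figures shows the 64-bit build running roughly 16-18% faster at every resolution. A minimal Python sketch using just the numbers quoted above:

```python
# Relative speedup of the 64-bit build vs. the 32-bit build,
# using the TechSpot 8800 GTX figures quoted above.
pairs = {
    "1440x900":  (38.6, 45.6),
    "1680x1050": (30.8, 36.4),
    "1920x1200": (25.8, 30.0),
}

for res, (fps32, fps64) in pairs.items():
    gain = (fps64 / fps32 - 1) * 100
    print(f"{res}: +{gain:.1f}%")  # ~+18.1%, +18.2%, +16.3%
```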
I haven't seen this comparison anywhere other than that site, so I was wondering if anyone has supporting or contradicting evidence.