64bit nvidia card?

JVC

[H]ard|Gawd
Joined
Oct 24, 2004
Messages
1,280
nvidia is saying that their next cards are going to be 64-bit? What does that mean? Aren't they beyond that already?
 
64-bit, hmm. Perhaps they're referring to 64-bit color depth? 16 bits per channel (RGBA), perhaps?

Maximum of 4,294,967,296^2 colors. I think they are up to 10 bits per channel now, or 40-bit color processing.
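The arithmetic above checks out; here's a quick sanity check (just math, not anything from NVIDIA):

```python
# 64-bit color: 16 bits per channel across R, G, B, A.
bits_per_channel = 16
channels = 4
total_bits = bits_per_channel * channels  # 64

# Total representable pixel values, which equals the square of the
# 32-bit count (4,294,967,296^2), since 2**64 == (2**32)**2.
total_values = 2 ** total_bits
assert total_values == 4_294_967_296 ** 2

print(total_bits, total_values)
```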
 
But isn't 64-bit colour depth the usual thing now? Just a marketing thing, then?
 
nvidia is saying that their next cards are going to be 64-bit? What does that mean? Aren't they beyond that already?
You mean the ones with double precision shaders? That will be nice for GPGPU uses.

Q: Does CUDA support Double Precision Floating Point arithmetic?

A: CUDA supports the C "double" data type. However on G80
(e.g. GeForce 8800) GPUs, these types will get demoted to 32-bit
floats. NVIDIA GPUs supporting double precision in hardware will
become available in late 2007.

TGDaily: http://74.52.174.180/content/view/30988/135/
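To see why that demotion matters, here's a small illustration (in Python rather than CUDA, so the round-trip through `struct` stands in for what the G80 hardware does to a `double`):

```python
import struct

def demote_to_float32(x: float) -> float:
    """Round-trip a Python double through a 32-bit float,
    mimicking how G80-era GPUs silently demoted 'double' to float."""
    return struct.unpack('f', struct.pack('f', x))[0]

x = 1.0 + 2 ** -40          # exactly representable in a 64-bit double
y = demote_to_float32(x)    # but not in a 32-bit float

print(x == 1.0)   # False: the double keeps the tiny increment
print(y == 1.0)   # True: the 32-bit float rounds it away
```

That kind of silent precision loss is exactly what makes single-precision-only hardware a problem for scientific GPGPU work.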
 
64-bit color depth. They call it Deep Color in newer HDTVs; it's something like a billion colors or more.
 
64-bit color depth. They call it Deep Color in newer HDTVs; it's something like a billion colors or more.
lol, read the post one up.

nvidia mentioned last week that future chips coming out later this year will have double-precision (64-bit) floating-point units.
 
Deep Color extends to 48-bit. A 32-bit color value already would, obviously, net you a 4-billion-color palette. Deep Color extends beyond the human eye's ability to differentiate.
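For scale, the palette sizes at the common Deep Color depths (this is just 2**bits, assuming the 30/36/48-bit depths from the HDMI 1.3 spec):

```python
# Palette size for each color depth: simply 2 raised to the bit count.
for bits in (24, 30, 36, 48):
    print(bits, 2 ** bits)
# 24-bit -> ~16.7 million, 30-bit -> ~1.07 billion,
# 36-bit -> ~68.7 billion, 48-bit -> ~281 trillion
```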
 
The reference to 64-bit refers to CUDA, as mentioned above.

Specifically, it's the ability to do double-precision floating point (64-bit), which GPGPU needs to be truly useful.

later this year
 