3ds Max Rendering - Best Card?

HOCP4ME

2[H]4U
Joined
Jul 1, 2005
Messages
2,959
Okay, I've been assigned to build a $1,000 workstation for 3ds Max rendering and am wondering what video card I should put in it. No, we cannot afford a Quadro :( . After a Q9550 and 8GB of RAM, I have exactly $260 left in the budget for the video card.

What card should I buy? I'm not sure exactly how 3ds Max uses the video card. Does it like processing power or a large amount of VRAM? I've also heard that consumer video cards can be soft-modded to workstation cards, which makes the work better in applications like these. Could/should I do this?

Sorry if I sound like a n00b; I've only built gaming computers before. 3D rendering is a requirement I've never tackled in the past.

Oh, and the card I buy should preferably be available at NewEgg. I need to buy and build this system by the end of this week.

Thanks guys. :)
 
Well if 260 is your max, might as well just go with this: http://www.newegg.com/Product/Product.aspx?Item=N82E16814130400

But you're asking for possible glitches by going that route, and nothing is done support-wise for any GeForce/Radeon issues you'll have with an Autodesk program if they arise. Soft modding sounds good, but I see a lot of people essentially just rename their cards and that's all, or they actually run the benchmarks and don't see much. Most of those soft mods also require older cards; the newer ones aren't capable of it, so you're possibly getting lesser specs/shader models/etc. by going that route too.

I'd honestly see what wiggle room or alternate methods you have to grab a mid-level Quadro off of eBay for $250-350. But if the money's too tight or that kind of avenue is a no-go, it's understandable.
 
Well if 260 is your max, might as well just go with this: http://www.newegg.com/Product/Product.aspx?Item=N82E16814130400

But you're asking for possible glitches by going that route, and nothing is done support-wise for any GeForce/Radeon issues you'll have with an Autodesk program if they arise. Soft modding sounds good, but I see a lot of people essentially just rename their cards and that's all, or they actually run the benchmarks and don't see much. Most of those soft mods also require older cards; the newer ones aren't capable of it, so you're possibly getting lesser specs/shader models/etc. by going that route too.

I'd honestly see what wiggle room or alternate methods you have to grab a mid-level Quadro off of eBay for $250-350. But if the money's too tight or that kind of avenue is a no-go, it's understandable.

Right now we're rendering on a computer bought from Best Buy with an I-don't-even-want-to-know graphics card in it. I think we'll be very happy with a GTX 260 ;) .

Thanks for the advice.
 
If this is a rendering workstation, maybe go easy on the CPU and buy the biggest and most badass gpu you can get? 9800GX2 ? Or wait a little for the GTX295?
 
Regardless of what card you get, whoever you are building the computer for will notice a huge difference with the new CPU and RAM. Something Max does that many other programs don't do well is scale with more cores. I recently built a new computer (see sig) and have watched my old renders go from around 4 hours or more down to twenty minutes on the new hardware.
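Rough math on that core scaling, if you're curious. A back-of-the-envelope sketch using Amdahl's law; the 95% parallel fraction is just my guess for illustration, not a measured figure for Max's renderer:

```python
# Back-of-the-envelope render-time speedup from more cores (Amdahl's law).
# The 0.95 parallel fraction is an assumption for illustration only,
# not a measured number for 3ds Max.
def speedup(cores, parallel_fraction=0.95):
    return 1.0 / ((1.0 - parallel_fraction) + parallel_fraction / cores)

for cores in (1, 2, 4, 8):
    print(f"{cores} cores: {speedup(cores):.2f}x")
```

Of course the real 4-hours-to-20-minutes jump also reflects faster clocks and memory on the new box, not just core count.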

While getting a Quadro is best, realistically the line between the higher-end graphics cards and the workstation cards is closing. The workstation cards' GPUs are just optimized for rendering rather than gaming. All the new ATI FireGL cards are based off the RV770 (my card's GPU), while the Quadros are based off the GPUs from NVIDIA's 200 series. They do outperform the regular graphics cards, but you have to do a cost/benefit analysis to really decide. For the most part I can let my longer renders go all night while I sleep, so the time I would save with a faster card may not matter as much as the cost difference. By managing when and how you render, you negate the time savings.

For your price it makes no sense to go with any workstation card; on Newegg you are only looking at around 256MB of memory, while with a regular graphics card you can get upwards of 896MB.

Go with the card listed by maverick and you will be fine.
 
You should consider dropping the Q9550 down to a lower end quad core like a Q6600, and put the savings toward a better video card, unless the Q9550 has some huge benefits over a Q6600 in 3ds Max.
 
Farming (network rendering) is the best way to go for faster rendering, so if you have extra computers lying around, that could help out in the long run.
 
Network rendering (with Max) is not possible unless all computers have the same version of Max as well as the same OS. I ran into this problem when trying to set up a render farm with two XP machines and a Vista machine. Backburner (3ds Max's net-render program) is also overall a bitch to set up, and unless you really have some decent machines it isn't worth doing, IMHO. It is better to simply have one dedicated machine that you can put your money into than to have to keep updating three different ones.
 
I used to support 3D Studio Max and spec out 3DS Max workstations as part of my old job so I'll chime in here.

If this is a rendering workstation, maybe go easy on the CPU and buy the biggest and most badass gpu you can get? 9800GX2 ? Or wait a little for the GTX295?

No, do not go easy on the CPU. You need CPU and RAM. There is a common misconception that the video card is used for rendering. The truth is, it isn't at all. The GPU is used for rendering the viewport in the workspace. Like any 3D application, the more complex the 3D work is, the more powerful the video card will need to be for things to be rendered in the viewport in real time or close to it. When it comes to actually rendering frames for output (the finished product), it is 100% CPU and RAM. The rest is meaningless. That is why render farms are essentially headless workstations much of the time.

farming (network rendering) is the best way to go for faster rendering, so if you have extra computers lying around, that could help out in the long run.

It is, but only if they are powerful enough or you have a shit load of them.

Network rendering (with Max) is not possible unless all computers have the same version of Max as well as the same OS. I ran into this problem when trying to set up a render farm with two XP machines and a Vista machine. Backburner (3ds Max's net-render program) is also overall a bitch to set up, and unless you really have some decent machines it isn't worth doing, IMHO. It is better to simply have one dedicated machine that you can put your money into than to have to keep updating three different ones.

They do need the same version of 3DS Max, as well as the same plug-ins if you are using any. However, installations for the purpose of network rendering are FREE, so you can do this as much as you want. I've never had a problem using different OSes for this purpose, though I've only done it between Windows 2000 and Windows XP, which worked fine. As for setting it up, it really isn't all that challenging.

Okay, I've been assigned to build a $1,000 workstation for 3ds Max rendering and am wondering what video card I should put in it. No, we cannot afford a Quadro :( . After a Q9550 and 8GB of RAM, I have exactly $260 left in the budget for the video card.

What card should I buy? I'm not sure exactly how 3ds Max uses the video card. Does it like processing power or a large amount of VRAM? I've also heard that consumer video cards can be soft-modded to workstation cards, which makes the work better in applications like these. Could/should I do this?

Sorry if I sound like a n00b; I've only built gaming computers before. 3D rendering is a requirement I've never tackled in the past.

Oh, and the card I buy should preferably be available at NewEgg. I need to buy and build this system by the end of this week.

Thanks guys. :)

Again, like any 3D application, the video card is used to draw the image on screen. The Quadro and FireGL cards are optimized for doing this within this type of application, rather than for games like a standard GeForce or Radeon. The actual rendering of the final product for use in animation, movies, or even still images is handled 100% by the CPU and memory, though I believe NVIDIA now has an application that lets you do some of this on the GPU.

If you can't afford a Quadro then the next best thing is going to be a Geforce GTX 280. You may of course want to wait for the Geforce GTX 295 to become available as well.
 
They do need the same version of 3DS Max, as well as the same plug-ins if you are using any. However, installations for the purpose of network rendering are FREE, so you can do this as much as you want. I've never had a problem using different OSes for this purpose, though I've only done it between Windows 2000 and Windows XP, which worked fine. As for setting it up, it really isn't all that challenging.

I stand corrected. I know in my case net rendering would not work going from XP to Vista. It may have a lot to do with going from a 32-bit OS in XP to my 64-bit Vista (which I assume the OP is doing, as he has 8GB of RAM).
 
I stand corrected. I know in my case net rendering would not work going from XP to Vista. It may have a lot to do with going from a 32-bit OS in XP to my 64-bit Vista (which I assume the OP is doing, as he has 8GB of RAM).

Well I haven't tried it between Windows Vista and Windows XP.
 
No, do not go easy on the CPU. You need CPU and RAM. There is a common misconception that the video card is used for rendering. The truth is, it isn't at all. The GPU is used for rendering the viewport in the workspace. Like any 3D application, the more complex the 3D work is, the more powerful the video card will need to be for things to be rendered in the viewport in real time or close to it. When it comes to actually rendering frames for output (the finished product), it is 100% CPU and RAM. The rest is meaningless. That is why render farms are essentially headless workstations much of the time.
QFT
Graphics card will make little difference. An i7 920 w/8 threads and 6GB DDR3 is where it's at.
 
From what I understand from other posts on the Autodesk site, it has to do with which version (32- or 64-bit) of Backburner you installed. You cannot mix versions, so if you have an XP machine running 32-bit Backburner, then it will not mix with the 64-bit version of Backburner that may run on a Vista machine. I do not remember if the newest edition of Max is 64-bit by default either. Link to posts here
 
Well the video card makes all the difference. Just not when it comes to rendering times. You still need a decent card to render the viewport which in turn directly impacts productivity for the person creating the 3D objects, animations, etc.
 
I've read another user saying that he went from 4 hours to 20 minutes rendering; now that's quite the difference. In this case I think dan is right: the card is important, but it's not all that matters for rendering :)
 
Is there any way the VUE rendering products could use my GTX260? I thought they could ONLY use the CPU...
 
While getting a Quadro is best, realistically the line between the higher-end graphics cards and the workstation cards is closing. The workstation cards' GPUs are just optimized for rendering rather than gaming. All the new ATI FireGL cards are based off the RV770 (my card's GPU), while the Quadros are based off the GPUs from NVIDIA's 200 series. They do outperform the regular graphics cards, but you have to do a cost/benefit analysis to really decide. For the most part I can let my longer renders go all night while I sleep, so the time I would save with a faster card may not matter as much as the cost difference. By managing when and how you render, you negate the time savings.

Actually, the only difference between Quadros/FireGLs and GeForces/Radeons is the amount of VRAM. The GPUs themselves have been identical for a very, very long time now (since the 9700 Pro days, actually). The only other difference is the driver, which is what accounts for the performance difference in workstation benchmarks.
 
Actually, the only difference between Quadros/FireGLs and GeForces/Radeons is the amount of VRAM. The GPUs themselves have been identical for a very, very long time now (since the 9700 Pro days, actually). The only other difference is the driver, which is what accounts for the performance difference in workstation benchmarks.

Exactly. The Quadro FX 5800 has 4GB of RAM. It is essentially the workstation equivalent of the Geforce GTX 280. I'm not sure if the memory and GPU clocks are the same or not, but I'd imagine that they aren't too far off from the reference Geforce GTX 280 clocks. Another difference is the price: the Quadro FX 5800 costs $3,149.99 compared to $374.99 for the Geforce GTX 280.
 
Exactly. The Quadro FX 5800 has 4GB of RAM. It is essentially the workstation equivalent of the Geforce GTX 280. I'm not sure if the memory and GPU clocks are the same or not, but I'd imagine that they aren't too far off from the reference Geforce GTX 280 clocks. Another difference is the price: the Quadro FX 5800 costs $3,149.99 compared to $374.99 for the Geforce GTX 280.

PNY's site lists the FX 5800 as having 102GB/s of memory bandwidth vs. the GTX 280's 141.7GB/s. If my math is correct, that would make its memory clock 1594MHz (effective) vs. the GTX 280's 2214MHz (effective). No one seems to want to list core or shader clocks.
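(Checking my own arithmetic: both cards use a 512-bit memory bus, so the effective clock is just the quoted bandwidth divided by the bus width in bytes. A quick sanity check using the figures above:)

```python
# Effective memory clock from quoted bandwidth: bandwidth = bus_bytes * clock.
# Both the Quadro FX 5800 and the GTX 280 use a 512-bit (64-byte) memory bus.
BUS_BYTES = 512 // 8

def effective_clock_mhz(bandwidth_gb_s):
    return bandwidth_gb_s * 1e9 / BUS_BYTES / 1e6

print(round(effective_clock_mhz(102.0)))   # FX 5800: 1594 MHz effective
print(round(effective_clock_mhz(141.7)))   # GTX 280: 2214 MHz effective
```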

Since I'm not too familiar with workstation loads, I have a question for the experts. Does the extra VRAM really help that much to make up for being clocked ~300MHz slower? The FX 5800 has 30% less memory bandwidth, which seems like it would be a rather significant hit in performance. Usually whenever I see someone doing 3D work there aren't any textures or anything, which would greatly reduce the amount of VRAM needed, wouldn't it? I think I'm missing something here; I just don't know what...
 
I agree with some of what you're saying regarding the memory used in general 3D modelling. However, for anything requiring detail shots, memory gets used up fast. Besides the CAD/DCC industry, the medical and oil industries use these behemoths for virtual exploration. This excerpt should explain things in more detail:

"To get a better idea of who exactly requires that large a frame buffer size, we spoke more to Tyler Worden, Market Development Manager of NVIDIA who's in charge of the professional visualization solutions. From our discussion with him, it seems that while 4GB sounds a whole lot, the medical, engineering and especially the oil and gas industries can eat volumes of processing power and memory requirements in a jiffy. He explained the data which these industries handle to churn out their models for visualization is so immense that even if the Quadro card had 16GB of frame buffer, that wouldn't satiate their requirements.

For example in the oil and gas exploration arena, seismic charges are set off and numerous sensors are placed across a huge area to map the terrain to form an underground 3D map of sort. No longer is data just analyzed on specific areas, but these days a much larger area is mapped to get a clearer view of the macro scale of things and better determine the aspects of the entire terrain. This could be several kilometers in every direction and correspondingly requires a terrific amount of data manipulation and crunching power to generate the required analysis models. The huge frame buffer of the latest generation of Quadro FX cards help on two grounds:- one is on the CUDA-enhanced specialized programs used to tap on the GPU's crunching power and thus a huge streams of data are fed thru the GPU for accelerated processing. The other is on the actual handling of the visual model which is extremely huge and requires a boatload of memory to handle it fluently."

Read more of it here http://www.hardwarezone.com/articles/print.php?cid=18&id=2768
 