Battlefield 4 recommended system requirements: 3 GB VRAM

Forget what the PS4 has. Games are going to be made for the lowest common denominator, which is the Xbox One with its DDR3. I doubt it has enough bandwidth or GPU power to deal with much more than 2GB of VRAM. Only time will tell, though. Obviously games are eventually going to use more than 2GB of VRAM. The question is when, and how quickly 2GB will stop being enough at 1080p.
 
Forget what the PS4 has. Games are going to be made for the lowest common denominator, which is the Xbox One with its DDR3. I doubt it has enough bandwidth or GPU power to deal with much more than 2GB of VRAM. Only time will tell, though. Obviously games are eventually going to use more than 2GB of VRAM. The question is when, and how quickly 2GB will stop being enough at 1080p.

I think you might be missing the point.

The 360 has 512MB of VRAM, and the ports we are playing right now from that 512MB console are using up to 2GB of VRAM (Crysis 3, Hitman, Metro: Last Light, BF3 with 8x+ MSAA).

Think about the explosion of VRAM usage when the XBONE has (I'm guessing here) 2-3 GB of the shared RAM to dedicate to the GPU!

We may certainly see 6+GB VRAM usage very soon!
 
My point, as in my previous post, is that those games were developed with PC specs in mind and then had their assets scaled down to Xbox 360/PS3 memory limits. The assets have always existed, and are being used in the PC "ports". The difference with next gen is that the same assets made for the PC version can fit on the consoles. This is why I don't think we'll see an explosion in VRAM requirements for a good 1.5-2 years (talking about 1080p here, which is what next-gen consoles are targeting).
 
I think you might be missing the point.

The 360 has 512MB of VRAM, and the ports we are playing right now from that 512MB console are using up to 2GB of VRAM (Crysis 3, Hitman, Metro: Last Light, BF3 with 8x+ MSAA).

Think about the explosion of VRAM usage when the XBONE has (I'm guessing here) 2-3 GB of the shared RAM to dedicate to the GPU!

We may certainly see 6+GB VRAM usage very soon!

I still don't get where all the FUD is coming from. Can you imagine how many computers there are that don't have 2GB of video memory? Take a look at the Steam stats and you'll see that just under 36% of people have only 1GB of video memory. Why will we suddenly jump to 6GB+ video memory usage? Lol.
 
Forget what the PS4 has. Games are going to be made for the lowest common denominator, which is the Xbox One with its DDR3. I doubt it has enough bandwidth or GPU power to deal with much more than 2GB of VRAM. Only time will tell, though. Obviously games are eventually going to use more than 2GB of VRAM. The question is when, and how quickly 2GB will stop being enough at 1080p.

Actually, the Xbox One has a higher theoretical bandwidth than the PS4. It will just take a few years to master, but I'm sure first-party exclusive games will master it much sooner.
 
My point, as in my previous post, is that those games were developed with PC specs in mind and then had their assets scaled down to Xbox 360/PS3 memory limits. The assets have always existed, and are being used in the PC "ports". The difference with next gen is that the same assets made for the PC version can fit on the consoles. This is why I don't think we'll see an explosion in VRAM requirements for a good 1.5-2 years (talking about 1080p here, which is what next-gen consoles are targeting).

Sure, but the assets for the PC ports were created with the understanding that ~2GB of VRAM was the upper target; few games breached that, even at higher resolutions, unless users went batty with stuff like MSAA.
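
To put rough numbers on the MSAA point, here's a minimal back-of-envelope sketch (assuming one 32-bit color target and one 32-bit depth/stencil target, both multiplied by the sample count; real engines add more targets, so treat it as a lower bound):

# Rough MSAA render-target cost at a given resolution.
def msaa_target_mb(width, height, samples, bytes_per_sample=4):
    pixels = width * height
    color = pixels * samples * bytes_per_sample   # multisampled color
    depth = pixels * samples * bytes_per_sample   # multisampled depth/stencil
    return (color + depth) / (1024 ** 2)

for samples in (1, 4, 8):
    print(f"1080p, {samples}x MSAA: {msaa_target_mb(1920, 1080, samples):.0f} MB")
# 1x: ~16 MB, 4x: ~63 MB, 8x: ~127 MB of render targets alone,
# before textures, geometry, or any extra G-buffer layers.

At 8x MSAA the render targets alone eat a noticeable slice of a 2GB card, which is how those old ports got pushed past their targets.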

Thing is, now the PC isn't the top spec- they're now targeting ~4GB of VRAM for graphics on the consoles, depending on the game. Are they just going to scale those assets down for PCs and not make them available at all? You're proving my point here :).

Expect them to ship those assets for the PC versions, even though few PCs have that extra VRAM. Of course settings can be lowered to accommodate those of us with the average 2GB-3GB cards, no problem. But also expect AMD and Nvidia to both release cards with more VRAM (it's relatively cheap and easy to do), and push developers to put those assets in the PC releases to create a demand for such cards. After all, both vendors are still on 28nm, so it's not like the cards are really going to get much faster. They're going to have to have some reason for us to upgrade!
 
I still don't get where all the FUD is coming from. Can you imagine how many computers there are that don't have 2GB of video memory? Take a look at the Steam stats and you'll see that just under 36% of people have only 1GB of video memory. Why will we suddenly jump to 6GB+ video memory usage? Lol.

Can you imagine how many computers can play Crysis 3? Battlefield 3 MP? It's only a small fraction of the Steam Stats results right?

Yet the market is there, and growing. We're going to jump to >4GB of memory usage, >6GB in some cases at least, because the assets will be there- they'll be shipping in the console versions of these games. And it gives AMD and Nvidia an excuse to sell you new video cards on the same tired 28nm node, just with higher-density memory modules welded on.

It's not like it's hard to make an 8GB HD7870 (oh look!) or GTX770, or a 12GB HD7970 or GTX780/Titan. AMD and Nvidia just need 'killer apps' in order to justify manufacturing them, and those are coming in the PC versions of next-gen cross-platform games.
 
It's not like it's hard to make an 8GB HD7870 (oh look!) or GTX770, or a 12GB HD7970 or GTX780/Titan. AMD and Nvidia just need 'killer apps' in order to justify manufacturing them, and those are coming in the PC versions of next-gen cross-platform games.
Except it's pointless. All those cards are too weak to make proper use of 6+ GB of RAM (including the Titan under gaming workloads).

We need next-gen cards before ANY of this discussion is in any way relevant.
 
I don't even know why Nvidia released GTX 680s with only 2GB of VRAM. Guess I'm one of those suckers. Time for an upgrade :(.
 
I don't even know why Nvidia released GTX 680s with only 2GB of VRAM. Guess I'm one of those suckers. Time for an upgrade :(.

Well, imagine the folks that bought GTX770s. I guess they won't be able to play anything in the coming year. If you put any stock into all the scary scenarios, you'll be a sucker once again. IMHO, of course. We've got folks already speculating that up to 6x the memory your cards have may turn out to be optimal.

I'm thinking that since we're all on PC here, you can turn down settings and have a game perform well enough to be playable. Upgrade sometime later down the road when GPUs have more processing power, not now for a card you're buying just because it has more memory.
 
Actually, the Xbox One has a higher theoretical bandwidth than the PS4. It will just take a few years to master, but I'm sure first-party exclusive games will master it much sooner.

Actually, it won't take years to master how to use the embedded memory. Most of the better-looking games coming out use deferred lighting, everything from Killzone Shadow Fall to Crysis, Halo: Reach, StarCraft, etc. Deferred lighting allows you to have a large number of light sources without a proportional performance cost; in older, forward-rendered engines, doubling the number of lights would roughly double the lighting cost. Deferred rendering requires a large amount of memory bandwidth because you need to store multiple layers of a frame to be composited together. That is part of the reason most console games are not rendered at 1080p: they are hitting bandwidth caps. And when you have just 10MB of eDRAM (Xbox 360) or 32MB of eSRAM (XB1), there are going to be storage and bandwidth caps to work with.
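
To make those caps concrete, here's a minimal sketch (the four-target G-buffer layout below is an illustrative assumption, not any particular engine's actual format):

# Rough G-buffer footprint for a deferred renderer.
# Assumes four 32-bit render targets plus a 32-bit depth buffer;
# actual layouts vary per engine.
def gbuffer_mb(width, height, color_targets=4, bytes_per_pixel=4):
    return width * height * (color_targets + 1) * bytes_per_pixel / (1024 ** 2)

print(f"720p:  {gbuffer_mb(1280, 720):.1f} MB")   # ~17.6 MB
print(f"1080p: {gbuffer_mb(1920, 1080):.1f} MB")  # ~39.6 MB
# 720p already overflows the 360's 10MB of eDRAM, and 1080p overflows
# the XB1's 32MB of eSRAM, hence sub-1080p rendering or tiling tricks.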

The Killzone Shadow Fall developers have already released a lot of information about how their engine works on the PS4, so I'm going to cite it as an example. They are already using 32MB of the PS4's RAM as a buffer for their deferred lighting, and that was when they thought the PS4 was going to have 4GB of RAM total. A PS4 launch title would already be maxing out that resource if it were on the Xbox One.

Here's a visual example of what they are using the frame buffer for:

[Image: breakdown of Killzone Shadow Fall's frame buffer layers]


The top is the final frame you see on the screen; the second is a set of buffer layers already combined to give you something close to the final image; the rest are captured and used to add reflection and light-bounce information to the final image.
 
Except it's pointless. All those cards are too weak to make proper use of 6+ GB of RAM (including the Titan under gaming workloads).

We need next-gen cards before ANY of this discussion is in any way relevant.

Yes, we do need to see next gen games- and you may very well be right!

But consider this. These games are meant to run on a console with an HD7870 at most- and can use up to ~4GB of VRAM.

My point all along has been that, unlike with current games, games developed on these consoles will be able to make use of more VRAM than we're used to in order to increase graphical fidelity without increasing the rendering load.
 
I don't even know why Nvidia released GTX 680s with only 2GB of VRAM. Guess I'm one of those suckers. Time for an upgrade :(.

Yup- me too! And I even knew it at the time!

But I bought my cards when they came out, and I knew what my limitations would be. They have served me well, and continue to do so.

The only takeaway is that now is not the right time to buy a graphics card, unless that card has 6GB of VRAM or more- or unless you need the performance immediately. In that case, just get whatever you can; obviously cards with 4GB of VRAM are about as fast as it gets without the price becoming unreasonable for a gaming-only card. Titans are obviously useful for other things too, and for those things they're a bargain :).
 
Planned obsolescence, dear fellow.

Remember the 2GB-per-GPU GTX690 and the 3GB-per-GPU HD7990? They could have charged whatever they wanted for models with more VRAM; it wouldn't have cost them $30 to double the memory on either card, and they could have raised the price another $200. What's another $200 when you're already spending a grand? Price obviously isn't an object at that point :).
 
But consider this. These games are meant to run on a console with an HD7870 at most- and can use up to ~4GB of VRAM.
Irrelevant when games coming out today can already swamp a Titan's GPU.

Games aren't going to suddenly stop using the GPU as much, even if they start using more RAM. All you end up with, even if your disaster scenario comes true, is the GPU being too slow AND not having enough RAM.

We need new cards before this can even be considered a problem. Current cards are going to run out of raw horsepower before RAM becomes a problem (unless you want to start running at console-like resolutions to decrease the demand on your GPU in order to artificially make yourself memory-bound).
 
You keep clinging to that idea instead of challenging it- that 'using more VRAM requires more GPU horsepower'. It's been mostly true, but not completely, ever since developers moved past the current generation of consoles as their development baseline.

That's why 'games coming out today' aren't a good point of comparison; they were developed within that old paradigm. Having consoles with ~5GB of RAM available to games, where developers previously had to make do with <256MB of RAM for everything but graphics, means that using 4GB for graphics in a game has become the new norm. And they'll do it within the performance envelope of an HD7870.

I really don't know how to restate this any better; I'm not terribly good at explaining the 'how we get there' stuff as much as I am at the 'where we're going' stuff. But that's where we're going.
 
You keep clinging to that idea instead of challenging it- that 'using more VRAM requires more GPU horsepower'.
I did not say that in my previous post. Not even close to the point I was making.

I said games, TODAY, can already swamp the fastest single prosumer GPU in existence, and that trend is NOT going to reverse itself. Next-gen games are going to hit the core even harder, and I'm sorry, but if the core can't maintain 30+ FPS (even given unlimited quantities of RAM to work with), then the whole discussion is pointless.

We're already GPU bound, and we're going to remain GPU-bound even if RAM usage starts increasing. We need a faster GPU before we start worrying about making 6GB standard.
 
I did not say that in my previous post.

I said games, TODAY, can already swamp the fastest single prosumer GPU in existence, and that trend is NOT going to reverse itself.

We're already GPU bound, and we're going to remain GPU-bound even if RAM usage starts increasing. We need a faster GPU before we start worrying about making 6GB standard.

You said that using more VRAM today requires more GPU horsepower; I'm saying both that a) we have examples to the contrary and b) the very existence of these consoles challenges that notion.

Sure, what you're saying has been mostly true- and I've tried to explain why it doesn't necessarily apply to games developed for these new consoles. The developers will find a way- and it's going to be a lot easier to do that than it was to get a game running on any previous Playstation at all.

Build it, and they will come, or so they say. The PS3 turned out to be quite the gaming machine, despite being one jacked up piece of hardware to develop for.
 
You said that using more VRAM today requires more GPU horsepower.
You're thinking of a post made earlier in this thread where I was making a different point, not the one you just quoted. Keep up...

We know the following two statements to be true:
- Current gen games already hit the GPU hard enough to verge on unplayable frame rates (even when well below the memory cap of the hardware).
- It is HIGHLY likely next-gen games will hit the GPU even harder, making performance even worse (even when well below the memory cap of the hardware).

Therefore:
- Current GPUs are not fast enough to warrant 6+GB versions existing. Odds are that they will be too slow no matter how much RAM is available on-card.
- We should reconvene when Nvidia or AMD launch a significantly faster GPU core that actually makes this a relevant problem.

Yes, I'm cutting your argument off at the knees. It doesn't matter even if you turn out to be dead-on (though I still highly doubt the memory usage of games will expand to the proportions you propose); with the above being the case, it's moot.
 
I wonder if all the extra RAM is largely going to be used as a hard drive cache. Loading 8x or more as much data off a slow hard drive is going to take a long time, so even if "only" 4GB of RAM is actively being used as RAM, having a large RAM cache allows content to be preloaded more smoothly.
 
Let me make one thing clear- I do appreciate your continued participation in this discussion; you've added a lot to this thread and have helped all of us flesh out the ideas we're discussing.

We know the following two statements to be true:
- Current gen games already hit the GPU hard enough to verge on unplayable frame rates (even when well below the memory cap of the hardware).
- It is HIGHLY likely next-gen games will hit the GPU even harder, making performance even worse (even when well below the memory cap of the hardware).

For the first one: almost exclusively, but not completely. Think about which features of current-gen games really hit the GPU hard, and which don't. Think about which features require more VRAM, and which don't. I've mentioned mods for Skyrim that can dramatically increase VRAM usage and graphical fidelity without significantly impacting performance unless you actually run out of VRAM, and the same holds true for the 'High Resolution Texture Pack' BioWare released for Dragon Age 2. Textures aren't the only thing that can use more VRAM and increase graphical fidelity without significantly increasing the load on the GPU, but they are the low-hanging fruit: the graphics artists creating them are already working from higher-resolution renders and then creatively down-sampling and compressing them to fit on each platform the game ships on.
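
To illustrate why textures are the low-hanging fruit, here's a minimal sketch of how texture resolution drives VRAM cost (assuming 4-byte RGBA texels, a full mip chain, and rough 4:1 DXT-style block compression; the numbers are illustrative):

# VRAM cost of a single square texture.
def texture_mb(size, bytes_per_texel=4, compressed=True):
    base = size * size * bytes_per_texel
    base *= 4 / 3              # a full mip chain adds ~33%
    if compressed:
        base /= 4              # ~4:1 block compression (DXT5-like)
    return base / (1024 ** 2)

for size in (1024, 2048, 4096):
    print(f"{size}x{size}: {texture_mb(size):.1f} MB")
# 1024: ~1.3 MB, 2048: ~5.3 MB, 4096: ~21.3 MB.
# Each doubling quadruples the memory cost while barely touching
# shader load, as long as everything still fits in VRAM.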

For the second, you're right- except that you can expect game developers to find ways to use the extra VRAM on these consoles without overloading the GPU, and that GPU is all of an HD7870, at best. The same engines they're using on these consoles are already running, and have been running, on PCs for years in earlier iterations, so we can expect the PC releases of these games to be capable of being more GPU-intensive than they are on the consoles, for sure. You'll need every ounce of performance you can get if you want to run everything at max settings. But what about the extra assets? The only way we're not going to need more VRAM to max out these games is if developers deliberately downgrade the graphics to target the average 2GB cards that high-end gamers are running. Do you really see that happening?

Therefore:
- Current GPUs are not fast enough to warrant 6+GB versions existing. Odds are that they will be too slow no matter how much RAM is available on-card.
- We should reconvene when Nvidia or AMD launch a significantly faster GPU core that actually makes this a relevant problem.

Most current GPUs are fast enough- or more than fast enough- to warrant more VRAM. But developers have been stuck in a development paradigm of first trying to break free of the current console generation's limitations, and then trying to push performance on the PC alone. While they've gotten pretty far, all things considered (look at BF3, Crysis 3, etc.), they're still operating with less VRAM than they actually need; hence games still have stupid-low-resolution textures in many places that could use more detail- try not to look too closely at anything. And again, employing higher-resolution textures, as an example, wouldn't kill performance unless you ran out of VRAM.

As for 'significantly faster GPUs', well, we'll have to wait until this time next year for those. In the meantime, games are already using more VRAM outright than they have been, and they're designed to run well enough on current-generation GPUs. So while the problem will be even more relevant next year and the year after, it's plenty relevant right here, right now, for people looking to upgrade and get more than six months of top performance.

Sure, faster cards will come out, but not having enough VRAM for easy wins like textures means you have to decide (like I will) either to compromise significantly on graphical fidelity or to compromise significantly on performance. Which boils down to one piece of advice: either buy a card with a lot of VRAM now, or wait until less expensive cards are released with more VRAM than they currently carry.

Yes, I'm cutting your argument off at the knees. It doesn't matter even if you turn out to be dead-on (though I still highly doubt the memory usage of games will expand to the proportions you propose); with the above being the case, it's moot.

You're not really 'cutting my argument off at the knees' so much as avoiding it altogether, which is fine; it's not like it makes a damn bit of difference for either of us, except to hammer out our points of view.

But I have two decades of watching this industry to see just exactly how transitions like this take place, and the indicators, to me, are blindingly obvious.
 
Any beta testers in here? How is performance comparable to BF3?

There have been a few tests from the alpha posted, but they're pretty rough. The real beta starts on the 1st, which most of us are probably signed up for (I am), either by having BF3 or BF4. We'll really have to see how it works out, though I'm not hopeful that my 2GB GTX670 cards will be enough to run the game at the highest settings VRAM-wise, or that they'll push that out to 2560x1600 at a reasonable frame rate. I may try my 1920x1200 monitor instead to get a feel for it.
 
I wonder if all the extra RAM is largely going to be used as a hard drive cache. Loading 8x or more as much data off a slow hard drive is going to take a long time, so even if "only" 4GB of RAM is actively being used as RAM, having a large RAM cache allows content to be preloaded more smoothly.

You can bet, as others have noted here as well, that the 'extra' memory will get used by some games for preloading, particularly to smooth out streaming in wide-open environments. There are a whole lot of possibilities here!
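
One plausible shape for that kind of preloading, as a toy sketch (this is a guess at the general idea, not any engine's actual streaming system):

# Toy asset-preload cache: keep soon-needed assets in spare RAM
# so the slow hard drive isn't hit mid-gameplay.
from collections import OrderedDict

class PreloadCache:
    def __init__(self, budget_bytes):
        self.budget = budget_bytes
        self.used = 0
        self.assets = OrderedDict()   # name -> bytes, in LRU order

    def preload(self, name, data):
        # Evict the least recently used assets until the new one fits.
        while self.assets and self.used + len(data) > self.budget:
            _, old = self.assets.popitem(last=False)
            self.used -= len(old)
        self.assets[name] = data
        self.used += len(data)

    def get(self, name):
        data = self.assets.get(name)
        if data is not None:
            self.assets.move_to_end(name)   # mark as recently used
        return data   # None means a trip to the hard drive

cache = PreloadCache(budget_bytes=3 * 1024 ** 3)   # e.g. 3GB of spare RAM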
 
I hope my M18x R1 will run it smoothly.

I know my Clevo with an i7-3610QM and GTX675M (GTX560 silicon and clocks, or close to it) is going to struggle, but it does have 2GB of VRAM (and 16GB of system RAM), so it should get by with better than 'low' settings, I hope.

Does your Alienware have an SLI setup?
 
Sure wish SLI VRAM were additive. Ugh. When's BF4 getting released again? (Skipping the beta.)
 
And yet another common misconception that VRAM is ALL that matters. I'm sure they'll release a 3GB GTX750 and all the Nvidia lovers will say it runs everything on "max settings".
 

People have already been reporting VRAM usage of >2.5GB at 1080p. Not saying that the posted benchmarks above are incorrect, but contradictory evidence does exist. At least with BF4, though, we'll largely be able to get away with 1GB and 2GB cards by dropping the settings, and 2GB cards should still be able to push reasonable detail levels.
 
And yet another common misconception that VRAM is ALL that matters. I'm sure they'll release a 3GB GTX750 and all the Nvidia lovers will say it runs everything on "max settings".

I'm really happy that you joined the [H] just to say that!

Now please find the quote (from anyone here) that states that 'VRAM is ALL that matters'.
 
People have already been reporting VRAM usage of >2.5GB at 1080p. Not saying that the posted benchmarks above are incorrect, but contradictory evidence does exist. At least with BF4, though, we'll largely be able to get away with 1GB and 2GB cards by dropping the settings, and 2GB cards should still be able to push reasonable detail levels.

In BF4, memory usage at 1440p peaked at 2680MB for me. FPS was all over the place; I believe it dropped into the high 30s. Specs in signature.

BF3 goes up to 2+GB at times, but that's just from memory of playing Noshahr Canals in 64-player Conquest. Haven't really played much these days.
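
Those figures fit a simple mental model: VRAM use is a big resolution-independent chunk (textures, geometry) plus a smaller chunk that scales with pixel count (render targets, post-processing). A toy sketch, with constants loosely fitted to the numbers in this thread rather than measured:

# Toy VRAM model: fixed asset cost + per-megapixel render-target cost.
ASSETS_MB = 2200       # textures, geometry, etc. (assumed)
MB_PER_MPIXEL = 130    # G-buffers, post-processing (assumed)

def vram_mb(width, height):
    return ASSETS_MB + MB_PER_MPIXEL * width * height / 1e6

print(f"1080p: {vram_mb(1920, 1080):.0f} MB")   # ~2470 MB
print(f"1440p: {vram_mb(2560, 1440):.0f} MB")   # ~2680 MB
# Stepping up the resolution adds far less than naive pixel-ratio
# scaling would suggest, because the asset chunk doesn't grow.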
 
In BF4, memory usage at 1440p peaked at 2680MB for me. FPS was all over the place; I believe it dropped into the high 30s. Specs in signature.

BF3 goes up to 2+GB at times, but that's just from memory of playing Noshahr Canals in 64-player Conquest. Haven't really played much these days.

Hell, I haven't had a day off recently to dig into BF4- I'll be able to enumerate the issues further when I do, up to 2560x1600 at least, with 2GB cards. Your experience does look typical, and I'm glad the game isn't too out of control, especially considering the increase in fidelity- thanks for sharing!
 
In BF4, memory usage at 1440p peaked at 2680MB for me. FPS was all over the place; I believe it dropped into the high 30s. Specs in signature.

BF3 goes up to 2+GB at times, but that's just from memory of playing Noshahr Canals in 64-player Conquest. Haven't really played much these days.

Not sure what's wrong, but I'm running 1440p on a 7950 @ 1.15GHz and I get a steady 60-70 FPS unless the game is stuttering (not a graphics-related issue). I do use a similar amount of VRAM. There is definitely something besides graphics lag going on in the beta, but I'm not sure what it is. I suspect it's network-based, or maybe Battlelog-based, since it's worse when more people (globally, not just on one server) are playing.
 
So, it's settled then, yes? We will need more RAM in the future, right? OK, I just need to be sure.
 
Tried BF4 again and my lowest dip was into the 40s. Lots of smoke and explosions; it happened as I got blown the heck up. It was better this time around.
 
Well, I have a Titan, so I don't care about VRAM requirements... in fact, the VRAM is what attracted me to the card in the first place.

Iray will not render when it runs out of VRAM, and it will not use the VRAM of a second card in SLI. The new 12GB Quadro cards are tempting me.

Go big on RAM or go home is my motto: max or nothing. I saw a long time ago where things are going for games- the same place high-end rendering is going. Game devs love nothing more than to use the uncompressed, high-detail game art and not dumb it down; if they can, they will. The raw art will make tiny cards cry before they compress the hell out of it and/or turn it into normal maps to make it fit in the game engine.
 