Who uses AA and FSAA? Really?

PSYKOMANTIS said:
As for those saying my monitor sucks... My monitor does a max res of 2048x1536 at 75 Hz and beyond (that's its maximum supported resolution, at the very least).
I highly doubt I'm missing anything on a high-end CRT monitor.

Psyko M.


Not saying your monitor sucks; I bet it's exactly the same monitor I have (it's on my wife's PC now). It's definitely smoother motion-wise than my LCD, but the detail just isn't there in comparison. I couldn't even use it as a second monitor, because the picture quality difference is THAT noticeable when the two are sitting side by side.

In retrospect, I think the aliasing bothers me more on the LCD than the CRT, simply because the CRT has a less sharp image. But I still notice enough on the CRT to bother me; no matter the resolution, I have to use FSAA.
 
PSYKOMANTIS, different people have different eyesight. I can see jaggies a mile away on a 19" or larger monitor; that's how sensitive my eyes are to them. Your eyes are different from mine, so they may just not be as sensitive to IQ as mine or others'.

I like to have every possible setting turned up high and still get a constant 60 fps. It's called EYE CANDY.

Right now I would love to have the AMD FX-60 with ATI's X1900 series CrossFire on Dell's new 30" monitor with every setting on full, but I currently just have the stuff in my sig. I can only run games like F.E.A.R. and Call of Duty 2 with 2xAA and medium AF, but in games like Counter-Strike: Source I have every setting on full and still get 60+ fps. I just can't stand a game with lower settings; that's why I want the latest and greatest.
 
the gamer said:
PSYKOMANTIS, different people have different eyesight. I can see jaggies a mile away on a 19" or larger monitor; that's how sensitive my eyes are to them. Your eyes are different from mine, so they may just not be as sensitive to IQ as mine or others'.

I like to have every possible setting turned up high and still get a constant 60 fps. It's called EYE CANDY.

Right now I would love to have the AMD FX-60 with ATI's X1900 series CrossFire on Dell's new 30" monitor with every setting on full, but I currently just have the stuff in my sig. I can only run games like F.E.A.R. and Call of Duty 2 with 2xAA and medium AF, but in games like Counter-Strike: Source I have every setting on full and still get 60+ fps. I just can't stand a game with lower settings; that's why I want the latest and greatest.

Thank you, [H], for all the responses! :D
If anything, I've read a lot about personal preference here.

I think what I'm going to do is take the earlier advice and turn things up a notch and try to play CS:S for a solid 2 weeks. Then I'll go back and see if I can see any difference.
Now, I know the 6800GT is old news, but it's still NO slouch of a card.

My friend, I can only dream of such a setup at this point in my life.
With my new $850-a-month rent plus bills, there's no money for new computer parts.
... now if only I could win the lottery.

Oh, and P.S., y'all: I don't "hate what I can't have." In my time I've had top-end cards going back to the days of my 8 MB Matrox Mystique, dual 12 MB Voodoo 2s, and such.
I'll put cash down that I was pulling 60 fps on most of my systems before some of you were even gaming, LOL. The fact is I've always had high-end hardware, yet I'm pretty much approaching this topic from the olden days of pre-filtering and pure polygon POWER.

Call me old school, but I just don't see why you have to turn everything on when running at high res. I guess the old saying goes: "don't knock it 'til you've tried it."
*sigh*
 
AF is a must for me. I run High Quality AF mode at all times, and 2xAA minimum too. The game just looks miles better with the AF cranked.
 
Not long ago I upgraded my gaming rig to SLI. I run it through my Samsung 19" 4 ms LCD monitor; the native res is 1280x1024, and this monitor will do 75 Hz at that res. The rig is probably overkill for the monitor's resolution, I reckon, but it means I can play everything with 8xAA and 16xAF. Well, everything I've played so far except F.E.A.R., where I have to drop the AA to 4x for the stress test not to fall below 40 fps.

I'm hoping it will be a while before this rig isn't powerful enough to max out IQ settings.

AF is a must, and to be honest so is AA at 1280x1024; it's just a question of how much, given the rig you're using.
 
4xAA and 16xAF regardless of resolution, which should be a minimum of 1280x960 or 1280x1024. If my video card cannot do 1280x960 with 4xAA and 8xAF minimum, I upgrade my system or video card. That is my absolute minimum on my four-year-old 21" CRT, which does 1600x1200 at 85 Hz max.
 
There's simply no excuse for not having AF turned on. It causes virtually zero slowdown and it makes textures look 54729345 times better.

And for anyone who says AA is useless above 1280x1024 or whatever, speak for yourself. I play games at 1600x1200. If I have to turn off AA to get a decent frame rate, the edge "creeping" can be pretty unbearable. It IS noticeable.
 
I play without it, or with it at the VERY lowest setting. It doesn't help gameplay (to avoid an argument, I am going to add a TO ME at the end of that sentence). I would rather spend the performance on other eye candy than on getting rid of jaggies; just my opinion...
 
I do. I wouldn't play without it!

1600x1200 with 4xAA/8xAF, and I get more than 60 fps in most games.
 
I didn't bother reading past the 10th post, so pardon me if this has been beaten up already.

You CAN notice the difference between an anti-aliased image and a non-anti-aliased image, even at 1920x1200. My rig supports it in most games. Take F.E.A.R., for instance: that game has this awesome per-pixel rendering thing going on, but the jaggies ruin it a bit, even at 1600x1200. It's noticeable. All it took was 2xAA, and it's sooo easy on the eyes now.

Anisotropic filtering is really important as well. My minimum is 8x, but I don't think I've run less than 16x since purchasing an 850XT last year. I like to actually see the textures the way the designers intended.

The only case I see where you wouldn't notice the jaggies too much would be if you had one of those ultra-performance 17" monitors that go above 1600x1200 (Samsung, Sony). I've owned 20-21" monitors for the most part, and only 2048xwhatever comes close to eliminating the need for AA.
 
PSYKOMANTIS said:
Now, I know the 6800GT is old news, but it's still NO slouch of a card.

Actually, it is, in newer games.

Call me old school, but I just don't see why you have to turn everything on when running at high res. I guess the old saying goes: "don't knock it 'til you've tried it."
*sigh*

Also, people play games for different reasons. Some are competitive and, like yourself, really don't pay attention to IQ, while others do. Everyone sees and perceives games differently: some notice aliasing and blurry textures and it bugs them; others aren't bothered, either because they simply don't notice it or because they don't have a card fast enough to run those features at playable performance.
 
AF should be switched on by default. It should be mandatory in everything. There's simply no reason to have anything less on today's cards. It saddens me that there are so many clueless people out there running games with horrible filtering, unaware that it could look so much better.
 
meatfestival said:
AF should be switched on by default. It should be mandatory in everything. There's simply no reason to have anything less on today's cards. It saddens me that there are so many clueless people out there running games with horrible filtering, unaware that it could look so much better.

I feel the same way; that's what I was getting at in my conclusion here (http://www.hardocp.com/article.html?art=OTUzLDEz) about how AF should be a given, on at all times automatically.

The performance is there in today's cards for that now.
 
Brent_Justice said:
I feel the same way; that's what I was getting at in my conclusion here (http://www.hardocp.com/article.html?art=OTUzLDEz) about how AF should be a given, on at all times automatically.

The performance is there in today's cards for that now.
By that, which cards are you referring to? The uber-expensive ones, like the 7800s and X1900s? Until I can see an actual difference in gameplay (i.e., it makes me better), I refuse to pay THAT much for anything just to get a few more settings. Just my opinion.
 
Even my friend's GeForce 6200 can use 8x AF without slowing down. You don't need a powerful card for it.
 
I'm sensing that perhaps video card reviews should have a few extra tests: maybe instead of doing a bunch of testing only to choose one resolution/IQ setting, we should see multiple data points for each card. To be honest, if a game can run at 1600x1200 with no AA but 4x AF, that information isn't the most useful to those who have an LCD with a native resolution of 1280x1024. Brent, maybe it would be more informative if the video card reviews illustrated the best playable settings for common resolutions (1920x1200, 1600x1200, 1280x1024, 1024x768). Then each person could see how a video card performs at their specific resolution.

 
PSYKOMANTIS said:
You know, I've never really understood the point of FSAA or AA.
Sure, it smooths out the infamous "jaggies" and makes games look pretty at low resolution. Great... FINE! FSAA and AA have their place in LOW RESOLUTION gaming.

What I'm asking is why people still care so much about FSAA and AA at all.
Say you have a good enough monitor that supports 2048x1536 @ 75+ Hz...
The point of buying a good video card is high frame rate and stability in high resolution gaming; that's pretty much a given.

What I don't get is that once you push your PC to 1600x1200 and beyond, you really don't notice the jaggies anymore, because you pretty much compress everything onto your screen.
What I'm getting at is: why stop and smell the roses, using FSAA and AA at high resolutions and hampering your FPS? I don't see FSAA or AA improving my CS:S scores or my K/D ratio in BF2...

I personally own a 21" Trinitron SGI RGBHV monitor I use for photo editing.
When I crank up the FSAA and AA, I really don't notice anything but a drop in my FPS and more system chugging.

What's really pissing me off is when hardware websites (including HardOCP) do reviews with all the FSAA and AA on. What about purists like me who don't turn that shit on? Whatever happened to HARD FPS facts without all the magic BS?
It's too fucking confusing to see cards pitted against each other based on how well they perform at quality settings.

Honestly, am I missing something?
I believe you are missing something. The only time you need to worry about FPS is when it drops below 30. There's no reason to try to make sure you have 100+ fps when you won't notice the difference between the two. It's all about graphics quality. Games from 6 or 7 years ago would probably run fast as hell on a current machine, yes? Well, that's great, but they look like shit. My point is that game graphics and 3D quality have improved immensely, and I want my games to look as real as possible (while retaining a decent 30+ fps). If you can cut down on the "computerized look" by using AF and AA, then it's a worthwhile feature. While I only run at 1280x1024, I noticed a major difference in graphics quality when using as much AF and AA as I can without degrading my FPS to below 30 frames per second. That rarely happens, except in a FEW games (F.E.A.R.), with my fairly high-end computer. Why is it so hard to grasp the idea that an average hardcore gamer would want the best possible graphics quality available?
 
I have an LCD with a native res of 1280x1024, and speaking from my personal preference, I need to have 4x AA at that resolution, or the shifting edges of jaggies actually distract me from what I'm doing in the game. As for AF, I usually run it at 2x or 4x, as it helps the image quality, but not nearly as noticeably as AA.
 
OK, my opinion?
I play at 1024x768 or above (1280x768 on my HTPC/plasma, 1024 or 1280 on my desktop, depending on the game).
AF is noticeable for sure; I crank it to the max, since the ground in front of you looks like poo without it.
However, I find AA to be pointless and never use it. I agree that it is very noticeable if you are standing still, lollygagging and "looking at the scenery," but I mostly play FPS games when I have the time, and I find that the jaggies are not noticeable in a fast-paced game and not worth the drop in frame rate. Now, it may be different for those of you who play slower-paced games; if there were time to stop and look for small details, I could imagine it being useful.
:D

P.S. For the guy above who said 30+ fps is good, so we don't need to worry as much about fps now: ah, WTF? Every FPS game I've ever played looks choppy below ~45-50; usually the low mark for smoothness (anything above it isn't noticeable) is 60 fps. So yeah, fps is still important, especially when the cards still don't keep up with the games unless you want to spend your life's savings.
 
I believe you are missing something. The only time you need to worry about FPS is when it drops below 30. There's no reason to try to make sure you have 100+ fps when you won't notice the difference between the two.

That's the most retarded thing I've ever heard. 30 fps is CRAP. Hell, even 60 fps is barely playable, IMO. A true "hardcore gamer" would want the highest frame rate they can get, to give them an advantage. Load up CoD2 in DX7 mode and uncap the fps (set the limit to 500), see what the game looks like at 125+ fps, then cap it at 60. Everything looks slower.
 
Well, to me gaming is about having fun and beating other people with skill, not equipment superiority. It's more fun for me if the game looks realistic and immerses me than if it runs at 8 million fps and looks like crap. I guess I'm just not 'hardcore' enough, in that I game for fun, not to beat the shit out of people.

I'm more into racing sims than FPSes, and in those anisotropic filtering is virtually required, or you can't see corner apexes or braking and turn-in points in the distance for crap. And I don't know about you, but if I look at a guardrail at a bit of an angle in real life, it doesn't develop a staircase edge or start shimmering as my viewing angle changes, so AA makes a difference. Having it on is less about being more immersive and more about being less distracting. However, I have to keep my frame rates above 30 at all times, so I run with whatever gets me that.

When I fire up an FPS, it's purely for fun and enjoyment, so I turn the eye candy up as much as I can without it affecting gameplay. That also seems to be around the 30 fps range for me, though the occasional dip below that doesn't seem to have as big an effect as in racing sims.

I'm not saying I can't tell if it's running faster than that (I usually can), but the difference from 30 fps to 60 fps is ridiculously small compared to AA and aniso on versus off, which is a much bigger difference.

On my 21" CRT, the difference from 1280x1024 to 1600x1200 wasn't that big, and I would choose 1280 with AA and aniso over 1600 without AA every time, as it just looked better. Now that I've switched to a 19" LCD, 1280x1024 is native, so the choice of res is made for me, and I run as much AA as I can without the fps dropping below 30.
 
A perfect game for pushing your card to the max and SEEING what it can do: Madden 06.

Even at 1600x1200 with 4xAA/16xAF, there is NOTICEABLE room for improvement with jaggies and texture blurring. Unfortunately, my GTX-256 wasn't good enough to run 8xAA, even at 600/1600, and now it's broken, so no screenshots from me and my Radeon 7000.

If another member would post screenshots of Madden 06, it would become obvious. Same with NFSMW and AF (motion blur takes care of a few jaggies AND low FPS). I'm not so much into FPS games, so I can't say there... but with sports and racing games, there's lots of room for improvement even at the settings people call great.
 
I use 4xFSAA and 16xAF on mine, but I have a 19" LCD monitor with a native res of 1280x1024, so I pretty much have to game at that; I can't go any higher and I don't want to go lower. My single 7800GT does just fine at that. I pretty much only play World of Warcraft, and it is helped quite a bit by both FSAA and AF, so it comes out looking its best. I don't think I could get better image quality by going to 1600x1200, so I'm quite happy. I get 60-70 fps out of it most of the time. With my X2, I have had little issue with system lag, though the frame rate sometimes shows 30-40 when there's a crowd; I never notice. I do get server lag, as all players of WoW do. I have run a few other games on it, UT2k4, Quake 3, Doom 3, Rise of Nations; they all have their own issues, but they perform well at 1280x1024 with 4xFSAA and 16xAF, so I'm good with it. My old video card, a 6800GT, had issues with FSAA at that res, so I upgraded specifically for that. I don't think I'll need anything higher for a long time, unless someone comes out with a cheap (sub-$300) LCD or OLED monitor with a higher res, and I don't think we'll see those for a long while.
 
Well, I have both types of monitors, LCD and CRT.
My LCDs are a Dell 2005FPW and a Dell 2405FPW, with native resolutions of 1680x1050 and 1920x1200 respectively.
I should say that even at those theoretically high resolutions, I can still see MUCH of the jaggies.
I also have a 19" CRT from LG, a T910B, that can do 2048x1536, and at that res the jaggies on this "small" monitor really do disappear.
So I can say that, in my opinion, on medium-sized monitors at high res you can do without AA, but on big ones, or medium-sized ones at medium to low res (1280x1024, for example), you can't ever let go of AA.

I would also like to say that AF at 8x and up is something I'll NEVER turn off.
I'm someone who would rather have better image quality at a normal frame rate than 400+ fps with a shitty image.
 
What is the point of gaming performance provided by video cards?

1. Higher frame rates
2. Higher quality result

Point 1 seems straightforward, and it is, but only up to a point, after which the brain perceives the motion as seamless. What that point is is a subject of argument, which I won't get into; but there is a point where everything looks smooth. This point certainly is not lower than 30 fps, although it may be higher. Frame rates above the point of fluidity are wasted, by definition, because they provide no additional value to the player.

Point 2 is very complex. What is quality? Quality can come in the form of higher polygon count to produce a more complex scene. It can come in the form of more advanced effects such as soft shadows. It can come in the form of higher resolutions to provide additional detail. It can come in the form of anti-aliasing so that thin lines aren't lost to sub-pixel rendering and diagonal lines appear smooth; and it can come in the form of filtering so that textures rendered at different angles appear correctly.
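
(An aside for the curious: here's roughly where those last two knobs live at the API level. This is only a minimal sketch using SDL2 and OpenGL's anisotropic filtering extension, not any particular game's or [H]'s code; the 4x and 16x values are just example levels, and real games expose the same settings through their own menus or the driver control panel.)

Code:
/* Sketch only: the actual SDL2/OpenGL calls behind "4xAA" and "16xAF".
   The 4x4 texture is a dummy so the sequence stands on its own. */
#include <SDL2/SDL.h>
#include <SDL2/SDL_opengl.h>

#ifndef GL_TEXTURE_MAX_ANISOTROPY_EXT
#define GL_TEXTURE_MAX_ANISOTROPY_EXT     0x84FE
#define GL_MAX_TEXTURE_MAX_ANISOTROPY_EXT 0x84FF
#endif

int main(void) {
    SDL_Init(SDL_INIT_VIDEO);

    /* Anti-aliasing: ask for a 4-sample multisampled framebuffer before
       the GL context exists, then enable multisample rasterization. */
    SDL_GL_SetAttribute(SDL_GL_MULTISAMPLEBUFFERS, 1);
    SDL_GL_SetAttribute(SDL_GL_MULTISAMPLESAMPLES, 4);

    SDL_Window *win = SDL_CreateWindow("aa-af-sketch",
        SDL_WINDOWPOS_CENTERED, SDL_WINDOWPOS_CENTERED,
        640, 480, SDL_WINDOW_OPENGL);
    SDL_GLContext ctx = SDL_GL_CreateContext(win);
    glEnable(GL_MULTISAMPLE);

    /* Anisotropic filtering: a per-texture parameter, capped at whatever
       the hardware reports (16x on the cards discussed in this thread). */
    GLfloat max_af = 1.0f;
    glGetFloatv(GL_MAX_TEXTURE_MAX_ANISOTROPY_EXT, &max_af);

    unsigned char texels[4 * 4 * 4] = { 0 };   /* dummy 4x4 RGBA texture */
    GLuint tex;
    glGenTextures(1, &tex);
    glBindTexture(GL_TEXTURE_2D, tex);
    glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA, 4, 4, 0,
                 GL_RGBA, GL_UNSIGNED_BYTE, texels);
    glTexParameterf(GL_TEXTURE_2D, GL_TEXTURE_MAX_ANISOTROPY_EXT,
                    max_af < 16.0f ? max_af : 16.0f);

    SDL_GL_DeleteContext(ctx);
    SDL_DestroyWindow(win);
    SDL_Quit();
    return 0;
}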

Of course, this is entirely arbitrary! Much of it depends on the game in question. If your preference is for online FPS gaming, maybe all this fancy stuff is meaningless because all that matters is maximal frame rate. If you like flight simming, you might like seeing the radio wire on your F4U displayed instead of lost, and the other plane you're pursuing will look much clearer with anti-aliasing. As bobzdar pointed out, race sims really are better with anisotropic filtering as high as possible. An MMORPG fan might want as much AA, AF, and as many fancy effects as possible, because the back end is doing all the game calculations. And some people prefer very high res, others prefer lower res with other quality settings higher, and still others are on a fixed resolution (mine's 1440x900, a 16:10 widescreen LCD panel).

[H] picked their set of standards. They may not match yours. For instance, in their EverQuest II reviews, [H] elects to use the 'Balanced' setting. I presume this also means they're using the 'Medium' texture setting, which is suitable for 128MB cards. With those settings, if you compare, say, a 6600GT 128MB and a 6600 256MB, the former card will certainly outperform the latter. On the other hand, the latter card would let players use the 'High' texture quality, which is certainly a significant quality improvement over 'Medium.' Should [H] fiddle with this setting in their reviews? No; EQ2 happens to have a phenomenal number of graphics options, and if the review started fiddling with them all, they'd never get a review done, nor would it be a review people could use to compare cards generally. Do I, as an EQ2 player, want to consider this other factor in a buying decision? Yes. But I don't expect [H] or anyone else to provide *the* perfect answer to every gamer's question across the huge range of systems, games, and video options available.
 
Some people like logs. Some people like frogs. Some people like buses, bolts, and coins. Some people like truffles, Colts, and milk. Some people like gum, ginger, and beans. Some people like frame rates. Some people like full-screen multisampled anti-aliasing. Some people like yellow paper, hot dogs, and bees.

And some people like everything!
 
I personally can't believe that in 2006 there are still people around who don't see the benefit of AA/AF, especially on this forum. I still see lots of jaggies at 1600x1200. I prefer 1280x960 with 4xAA/8xAF over 1600x1200 with no AA/AF; the difference is just too great. If someone only cares about fps, then why not run the game at the highest res possible with all details on low to get the highest fps? Because to me, sacrificing AA/AF for res is akin to turning details down to low for the sake of more res.
 
Good grief, I feel like such an idiot.

I was going to take some HL2 screenshots to support the claim that I couldn't see any difference between 4xAA and AA off at 1600x1200, when I discovered I had been running with 4xAA all along, even though I thought it had been turned off.

So I tried turning it off. Yikes.

In the words of Emily Litella, "Never mind." :eek:
 
I never really understood the point of larger displays, such as a 21", as opposed to two 17" monitors (CRT or LCD), which give you a lot more room to play with in everything else (games aside). The extra four inches of a 21" typically hits your wallet for the price of two 17"ers and leaves you without 13" of real estate. Now, granted, my two LCDs have a max resolution of 1280x1024, but that's where using AA/AF comes in. In most everything, my X850 XT affords me 4xAA and 16xAF, whereas with, say, F.E.A.R. and UT2004 I have to downshift to 2xAA and 16xAF. Either way, I may have a smaller screen/resolution, but everything still looks superb.
 
z3r0- said:
That's the most retarded thing I've ever heard. 30 fps is CRAP. Hell, even 60 fps is barely playable, IMO. A true "hardcore gamer" would want the highest frame rate they can get, to give them an advantage. Load up CoD2 in DX7 mode and uncap the fps (set the limit to 500), see what the game looks like at 125+ fps, then cap it at 60. Everything looks slower.
60 fps 'barely' playable... LOL. OK, Mr. Robot Eyes. Sorry, didn't know you didn't have human eyes. My bad!
 
If you're not turning on AA/AF, then why bother spending all the money on PC gaming? Just buy a console and game on your friggen TV....

I set up a profile for each game I play... most of which are now just set to application preference, since games now have in-game settings for AA/AF...

When I play on systems that have no AA turned on, it hurts my eyes.... Plus, having 2x or 4x AA on may sometimes actually give you faster FPS, because the card can have a sweet spot with that feature turned on...

So you people who say "I turn off AA just to get FPS" may be shooting yourselves in the foot...
 
pmrdij said:
I never really understood the point of larger displays, such as a 21", as opposed to two 17" monitors (CRT or LCD), which give you a lot more room to play with in everything else (games aside). The extra four inches of a 21" typically hits your wallet for the price of two 17"ers and leaves you without 13" of real estate. Now, granted, my two LCDs have a max resolution of 1280x1024, but that's where using AA/AF comes in. In most everything, my X850 XT affords me 4xAA and 16xAF, whereas with, say, F.E.A.R. and UT2004 I have to downshift to 2xAA and 16xAF. Either way, I may have a smaller screen/resolution, but everything still looks superb.

If you ever gamed on a 21" CRT, you wouldn't feel that way... However, if you ever had to lug a 21" CRT around, you'd end up with an LCD :). I was very tempted to go for a 21" LCD or the 24" widescreen Dell, but couldn't justify all that extra money. With CRTs, a 21" is pretty cheap now; it doesn't make sense to get anything smaller...
 
Playing WoW at 2304x1440 on my 24" CRT, I can say there is an image quality difference between using AA/AF and not. It truly is just icing on the cake, but whatever, I like it.

X1900 XTX for teh win!
 
bobzdar said:
If you ever gamed on a 21" CRT, you wouldn't feel that way... However, if you ever had to lug a 21" CRT around, you'd end up with an LCD :). I was very tempted to go for a 21" LCD or the 24" widescreen Dell, but couldn't justify all that extra money. With CRTs, a 21" is pretty cheap now; it doesn't make sense to get anything smaller...

What?

Sacrificing IQ and paying three times as much money so that a monitor most people never move would be easier to carry? What's wrong with that picture? I mean, who, other than the LAN party freak, moves their monitor often enough to make that worth it, and who isn't physically capable of moving 100 lbs once or twice in its lifetime?
 
I'm in the no-AA boat myself. It is the last thing I turn on, as a perk. I fall under the unfortunate category of people who get motion sickness playing FPS games. In my case, I can play most of these games if the frame rate stays at a solid 60+ fps and doesn't dip below that threshold too often. But there are some I can't play even with a solid frame rate; it depends on how the engine is designed and how well it reflects real-life movement.

For F.E.A.R., I tried 800x600 with everything on (including soft shadows) versus 1600x1200 (with everything except AA and soft shadows). I find that I prefer the higher resolution over the extra quality. Actually, 1024x768 is also very playable on my setup with everything enabled, but having an LCD with 1920x1200, I prefer to run everything at native resolution or at a resolution that scales well. I find that when I turn on any AA at that resolution, the levels with more open spaces or more enemies still drop below 60 fps often, and for long enough periods that I have to turn off the game and lie down for a moment (ah, the pain I go through to get my fix :( ).

I do admit AA adds more to the game (sometimes the edge crawl makes me jump in F.E.A.R. because I think there's an enemy moving in the distance), but in general everything is moving so fast that I really don't notice it.
 
For me, FSAA and AA are a gaming necessity. Our technology has come so far that it's stupid not to run them. Yes, you do take a performance hit, but it just looks so nice. Even consoles are using it now. The X360, IMO, does a great job of running it (when games are coded correctly), and so should the PS3.
 
nobody_here said:
What?

Sacrificing IQ and paying three times as much money so that a monitor most people never move would be easier to carry? What's wrong with that picture? I mean, who, other than the LAN party freak, moves their monitor often enough to make that worth it, and who isn't physically capable of moving 100 lbs once or twice in its lifetime?


Ummm, I think you took what he said the wrong way. I think the point he was making is that if you are a gamer, the only reason to own an LCD is for carrying to LAN parties. If you don't go to LAN parties, then a nice big CRT will give better IQ and much better flexibility when it comes to resolution.

Also, for what it's worth, I can see a massive difference in IQ at 1600x1200 when AA/AF are turned on versus off. Anyone who doesn't see this difference should check whether they're having equipment problems with their monitor or video card.

Or maybe they should stop by the optometrist's office. ;)
 
FSAA, yeah, I could see it as not being "necessary," but it's still nice... AF? Are you kidding me? Games look like complete crap without AF.
 
nobody_here said:
What?

Sacrificing IQ and paying three times as much money so that a monitor most people never move would be easier to carry? What's wrong with that picture? I mean, who, other than the LAN party freak, moves their monitor often enough to make that worth it, and who isn't physically capable of moving 100 lbs once or twice in its lifetime?

Engineers in the semiconductor industry who have to switch jobs every couple of years... I bought my 21" monitor five years ago and have had to move it way too many times. This last time I left it with my parents and bought an LCD so I wouldn't have to carry the damn thing again. The image is sharper on the LCD, and to be honest, I don't game enough anymore for the ghosting to bother me much. The 21" was nicer overall, though; I just didn't want to lug it to VA for my latest job switch.
 