What is going on at Beyond3D.com?

Brent_Justice said:
Is that all you can do?
At least explain how one game shows me how another game can perform; you, being the [H] video card man, should be able to write me a book about it :D
 
batty said:
Ya, go ahead, I won't feel it through the spandex anyway. Or is it rubber nowadays?



Anyone can use it.
Spandex? Why?
Ya, that's what I meant...
 
This thread went off on such a tangent that I think we made the geometry teachers of the world cum.
 
Moloch said:
At least explain how one game shows me how another game can perform

You asked how Doom 3 performance can tell you about HL2 performance. Well, it can't! Duh!

That isn't the point.

The point is to relate the gaming experience, which involves both the performance and the image quality (IQ) of games.

You can't find out the performance/IQ of a game without using that game; there are no ifs, ands, or buts about that.

But by using the latest games out there with the latest engines, you can make educated predictions about future games that might be using those engines.

You can see trends start to form: Card A may be better at one type of game compared to another, etc.

That's the short explanation.
 
Brent_Justice said:
You asked how Doom 3 performance can tell you about HL2 performance. Well, it can't! Duh!

That isn't the point.

The point is to relate the gaming experience, which involves both the performance and the image quality (IQ) of games.

You can't find out the performance/IQ of a game without using that game; there are no ifs, ands, or buts about that.

But by using the latest games out there with the latest engines, you can make educated predictions about future games that might be using those engines.

You can see trends start to form: Card A may be better at one type of game compared to another, etc.

That's the short explanation.
I agree with certain types of games being better on one card, but I can find the theoretical performance of a card using 3DMark, testing pixel shading speed (which admittedly is memory bandwidth bound), vertex shading, fillrate, etc.
 
Moloch said:
That can be arranged, given we both live in the same area.

Have a go at it then. It's easy to threaten people over the intarweb.
 
Moloch said:
I agree with certain types of games being better on one card, but I can find the theoretical performance of a card using 3DMark, testing pixel shading speed (which admittedly is memory bandwidth bound), vertex shading, fillrate, etc.

None of that really tells you how a video card will perform in games.

Look what happened with Doom 3: the X800 series has faster theoretical numbers, fillrate, etc.

But look at which series of video cards beat it in performance ;)
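
For anyone curious, the "theoretical numbers" being argued about here are simple products of published specs. Below is a minimal sketch (Python; figures are the widely reported ones from period reviews of the two 2004 flagships, and the closing comment is the commonly cited explanation, not a definitive analysis):

Code:
# Back-of-the-envelope theoretical specs for the two 2004 flagship cards.
# Figures are the widely reported ones; illustrative, not authoritative.

def fillrate_mpix(pipelines, core_mhz):
    # Theoretical pixel fillrate in Mpixels/s: pipelines x core clock.
    return pipelines * core_mhz

def bandwidth_gbs(bus_bits, mem_mhz_effective):
    # Theoretical memory bandwidth in GB/s: bus bytes x effective clock.
    return bus_bits / 8 * mem_mhz_effective / 1000

cards = {
    "Radeon X800 XT PE":  {"pipes": 16, "core": 520, "bus": 256, "mem": 1120},
    "GeForce 6800 Ultra": {"pipes": 16, "core": 400, "bus": 256, "mem": 1100},
}

for name, c in cards.items():
    print(f"{name}: {fillrate_mpix(c['pipes'], c['core'])} Mpix/s, "
          f"{bandwidth_gbs(c['bus'], c['mem']):.1f} GB/s")

# Radeon X800 XT PE: 8320 Mpix/s, 35.8 GB/s
# GeForce 6800 Ultra: 6400 Mpix/s, 35.2 GB/s
#
# On paper the X800 XT PE has ~30% more pixel fillrate, yet the 6800 Ultra
# led in Doom 3 -- the engine leans on stencil shadows and depth-only
# passes that NV40 handles unusually well, which no synthetic fillrate or
# bandwidth number captures. That is Brent's point in a nutshell.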
 
Ya know, I was thinking the same thing. How about meeting on the battlefield?

Say, one on one in Far Cry?

I would love to be the viewer in that game. Best out of three?
 
I actually can't think of a reason why this thread is still open. It's even running out of absurdity entertainment value. It needs to be locked.
 
It makes it really easy to take screenshots. 5 seconds per frame = you never miss a detail.
 
BlackPearl said:
OK, can we get a show of hands on how many gamers are using S3?

I have an S3 Trio64 in my old box that acts as a server now. Does that count? ;)
 
They shouldn't enable it to begin with. There are only two choices for gaming cards right now: an ATi card or an NV card. S3 doesn't matter. Enabling it by default is biased. 3Dc isn't even in the benchmark, but it is in some games, and more in the future. Don't do one if you aren't going to do both. I don't have a problem with it being in the benchmark, but enabling it by default is just wrong. Being too lazy (or other reasons) to put in 3Dc shows a clear lack of trying to make it a good predictor of gaming performance, to me.

3DMark hasn't been good since 2001 for me. I am glad [H] uses real games to benchmark. The simple fact that drivers from both ATi and NV can raise the score by about 1000 points in 3DMark, yet come nowhere near that approx. 25% increase in actual games, tells me all I need to know.

Scali, it is ironic that you registered the same day you got banned from B3D. So [H] wasn't good enough for you till you got banned, and now you need a place to cry about it, eh? How about not using these forums for your own personal agenda.
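
For readers who don't know the two acronyms being fought over: DST is Nvidia's depth stencil texture shadowing path, and 3Dc is ATI's two-channel normal-map compression (later standardized as BC5/ATI2). 3Dc's trick is that a unit normal satisfies x² + y² + z² = 1, so only x and y need to be stored and z can be rebuilt in the shader. A minimal sketch of that reconstruction, with plain Python standing in for the pixel-shader math:

Code:
import math

def reconstruct_normal(x, y):
    # x, y are assumed already decoded from the two compressed 3Dc
    # channels into [-1, 1]; z is derived, so it costs no storage.
    # max() guards against small negative values from rounding error.
    z = math.sqrt(max(0.0, 1.0 - x * x - y * y))
    return (x, y, z)

print(reconstruct_normal(0.6, 0.0))  # -> (0.6, 0.0, 0.8), up to rounding

Spending both compressed channels on x and y is why 3Dc normal maps hold up better than DXT-compressed ones, which is what made its absence from the benchmark worth arguing about at all.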
 
fallguy said:
Scali, it is ironic that you registered the same day you got banned from B3D. So [H] wasn't good enough for you till you got banned, and now you need a place to cry about it, eh? How about not using these forums for your own personal agenda.

How about we lock it now? =\
 
Ludic said:
How about we lock it now? =\

I second that. I just stopped in to check things out and HOLY $HIT..... why in God's name has this made it to a second page..... jeeesus H christ.

I have no intention of trying to understand or interpret what Scali is/was trying to accomplish this/that time..... but it doesn't really matter anymore; this is one of those threads that can really turn you away from a forum for a little while.

This is why I wish I was a mod, because this would have been locked after the first couple of posts.... when I realised that it had NO place here and had nothing to do with the forum itself. :rolleyes:
 
fallguy said:
They shouldn't enable it to begin with. There are only two choices for gaming cards right now: an ATi card or an NV card. S3 doesn't matter. Enabling it by default is biased. 3Dc isn't even in the benchmark, but it is in some games, and more in the future. Don't do one if you aren't going to do both. I don't have a problem with it being in the benchmark, but enabling it by default is just wrong. Being too lazy (or other reasons) to put in 3Dc shows a clear lack of trying to make it a good predictor of gaming performance, to me.

According to Futuremark's guidelines, a feature must be implemented in more than one IHV's cards.

Nvidia's DST (depth stencil textures) is implemented in Nvidia and S3 cards = 2 IHVs

ATI's 3Dc is only implemented in ATI cards = 1 IHV

Therefore 3Dc does not qualify, while DST does. You can't go discounting IHVs that sell video cards just because it supports your argument better.
 
tranCendenZ said:
Therefore 3Dc does not qualify, while DST does. You can't go discounting IHVs that sell video cards just because it supports your argument better.

Exactly, it's just not right. You can't say "who cares about S3" when it's still an IHV that needs to be accounted for, which DOES make DST valid. Trying to say anything different, like "we just shouldn't count them," is just being ignorant..... for shame. :(
 
fallguy said:
They shouldn't enable it to begin with. There are only two choices for gaming cards right now: an ATi card or an NV card. S3 doesn't matter. Enabling it by default is biased. 3Dc isn't even in the benchmark, but it is in some games, and more in the future. Don't do one if you aren't going to do both. I don't have a problem with it being in the benchmark, but enabling it by default is just wrong. Being too lazy (or other reasons) to put in 3Dc shows a clear lack of trying to make it a good predictor of gaming performance, to me.

3DMark hasn't been good since 2001 for me. I am glad [H] uses real games to benchmark. The simple fact that drivers from both ATi and NV can raise the score by about 1000 points in 3DMark, yet come nowhere near that approx. 25% increase in actual games, tells me all I need to know.

Scali, it is ironic that you registered the same day you got banned from B3D. So [H] wasn't good enough for you till you got banned, and now you need a place to cry about it, eh? How about not using these forums for your own personal agenda.

OK, here's the fact of the situation. If it were 3Dc that was enabled by default, you wouldn't complain for a second. You don't care about what is fair; you care about what makes your favorite company look best. If you say otherwise, you're lying, and you know it. End of story.

Seriously, what is 3DMark supposed to do? Not support PS 3.0 and DST because ATi was too lazy to build a new core this year? I think not. If you want to whine about something, go bitch to all the ATi boys over at B3D; you'll find they are much more sympathetic to your cause.
 
^eMpTy^ said:
Seriously, what is 3DMark supposed to do? Not support PS 3.0 and DST because ATi was too lazy to build a new core this year? I think not. If you want to whine about something, go bitch to all the ATi boys over at B3D; you'll find they are much more sympathetic to your cause.

While I do agree with you to an extent here, you REALLY need to watch yourself..... as you've no right to get like that. Talking down to someone is one way of showing that you think you're better than them in some way, and we don't want that at [H] at all.
 
cornelious0_0 said:
While I do agree with you to an extent here, you REALLY need to watch yourself..... as you've no right to get like that. Talking down to someone is one way of showing that you think you're better than them in some way, and we don't want that at [H] at all.

What can I say, I call it like I see it...

Please excuse my harshness, but my point stands... this isn't a matter of fairness, this is a matter of playing favorites... the title of the thread is "what's going on at b3d"... and that's what's going on... they are a bunch of ATi fans who are looking for something to whine about... I'm not saying that it's wrong, I'm not saying it makes them bad people... I'm just saying: call it what it is.
 
^eMpTy^ said:
What can I say, I call it like I see it...

Please excuse my harshness, but my point stands... this isn't a matter of fairness, this is a matter of playing favorites... the title of the thread is "what's going on at b3d"... and that's what's going on... they are a bunch of ATi fans who are looking for something to whine about... I'm not saying that it's wrong, I'm not saying it makes them bad people... I'm just saying: call it what it is.

Alright, I'll give you that.....
 
..., which DOES make DST valid.

IMHO a feature which affects Futuremark's reference image isn't valid, period. DST shadows are much rougher than the reference shadows. Not a tiny little bit in some scenes; it's noticeable in almost every single frame.
If you don't draw a line (my statement), then apples-to-apples comparisons are hardly possible anymore. What's next?

However, I can understand why Futuremark "allows" DST for NV; the gap with ATi would simply be too big, and they won't disturb the "balance" in the graphics world.
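
For context on why the shadows can't match: the reference-style path does discrete depth compares per pixel, while the DST hardware returns the comparison result already filtered (percentage-closer filtering, PCF). A toy sketch with made-up depths; the real 3DMark05 paths are more involved, and this only shows that the two styles disagree at shadow edges:

Code:
shadow_map = [  # tiny made-up depth tile: occluder at 0.3, background at 0.9
    [0.3, 0.3, 0.9, 0.9],
    [0.3, 0.3, 0.9, 0.9],
]

def single_compare(x, y, pixel_depth):
    # Reference-style lookup: hard 0/1 light value from one sample.
    return 1.0 if pixel_depth <= shadow_map[y][x] else 0.0

def pcf_2x2(x, y, pixel_depth):
    # DST-style lookup: hardware averages four neighboring compares.
    taps = [(x, y), (x + 1, y), (x, y + 1), (x + 1, y + 1)]
    return sum(single_compare(tx, ty, pixel_depth) for tx, ty in taps) / 4

# A pixel at depth 0.5 straddling the occluder edge:
print(single_compare(1, 0, 0.5))  # 0.0 -> fully shadowed
print(pcf_2x2(1, 0, 0.5))         # 0.5 -> half-lit: a different image

Which path looks better is exactly what the thread is arguing about; the point here is only that they do not produce the same image, so comparing either against the reference render is no longer apples to apples.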
 
Apple740 said:
they won't disturb the "balance" in the graphics world.

Hehe, yeah..... we don't need any more "disturbances" involving Futuremark and Nvidia..... well put. :p
 
^eMpTy^ said:
OK, here's the fact of the situation. If it were 3Dc that was enabled by default, you wouldn't complain for a second. You don't care about what is fair; you care about what makes your favorite company look best. If you say otherwise, you're lying, and you know it. End of story.

Seriously, what is 3DMark supposed to do? Not support PS 3.0 and DST because ATi was too lazy to build a new core this year? I think not. If you want to whine about something, go bitch to all the ATi boys over at B3D; you'll find they are much more sympathetic to your cause.

Actually, I would complain if DST wasn't enabled as well. And that's not a lie, no matter what you think. Neither should be enabled if the other isn't. 3Dc is open; anyone can use it. I can say S3 doesn't matter, because no serious gamer uses one of their cards. How many video card sub-forums do we have here? How many S3 video card posts are there per month here?

None of this would be a problem if it were just toggleable and set to off by default. But the fact that you can't even turn it off without paying for the program makes apples-to-apples comparisons impossible for most people. And that's the only problem I have with it. I don't really care that 3Dc isn't in the bench, just that DST is enabled by default and you can't turn it off without paying for the program.
 