1000 boxes/pipes/cans: will it bog down your network?

psychoace

[H]ard|Gawd
Joined
Nov 3, 2002
Messages
1,068
I am seriously wondering how it is possible to track all these things in real time when connected to an online server. Most games don't have more than 32 players per match, and now you're also talking about tracking 1000 individual items, each moving fluidly and independently. How is that possible? Will you even need the card if it's over a network, since wouldn't someone else's card, or the server's, be processing the physics and not yours? Is there some kind of setup where it detects something being hit or moved and then lets your physics card calculate how everything else around it should move? If so, couldn't there be bad syncing problems when there's lag? Maybe you missed an update and now your PPU has calculated a block being in front of you when in fact someone threw it five miles south while you were lagged. I just thought this would be some good stuff to talk about.
 
Psychoace--

This is one of the main reasons I'm very skeptical about the PPU ever seeing strong adoption. Cell Factor is supposed to be the "killer app" for Ageia, but I've never even seen them claim it will work over the internet. I've only seen claims for LAN play.

With latency, lag, choke, etc., you're quite right in pointing out that synchronizing all of the clients would be a nightmare. It's already extremely hard for "normal" games that have far more limited action going on, like Counter-Strike: Source.

What CS:S does is a pretty good approach, and probably the best one out there. They go by "what you see is what you get" from the perspective of the laggy or choked player-- in other words, if you fire your gun while someone is in your sights, then even though a moment of lag means they have actually already slid behind a wall and are safe, the hit will still be applied to them. This lets people with less-than-crippling lag play the game in a satisfactory fashion.

It also means that some people complain they were hit when they "couldn't possibly have been"-- but that same person gets a compensating advantage: if they poke around the corner and see the laggy player, they can fire and hit them before the lagged player actually sees them appear. Again, just a 1/10th of a second difference.

Valve implemented it well, but it's still imperfect-- because network connectivity is imperfect.
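To make the CS:S approach concrete, here's a toy Python sketch of server-side lag compensation: the server keeps a short history of world snapshots, rewinds to the world the shooter actually saw, and tests the shot against that. This is only an illustration of the general idea, not Valve's actual netcode; every class name and number below is invented.

```python
# Toy sketch of server-side "favor the shooter" lag compensation.
# All names/numbers are made up for illustration; not Valve's code.
from dataclasses import dataclass

@dataclass
class Snapshot:
    time: float
    positions: dict  # player_id -> (x, y)

class LagCompensator:
    def __init__(self):
        self.history = []  # snapshots, oldest first

    def record(self, snap):
        self.history.append(snap)

    def rewind(self, server_time, shooter_latency):
        """Find the stored world state closest to what the shooter saw."""
        target = server_time - shooter_latency
        return min(self.history, key=lambda s: abs(s.time - target))

    def check_hit(self, shooter_id, aim_at, server_time, shooter_latency, radius=1.0):
        snap = self.rewind(server_time, shooter_latency)
        for pid, (x, y) in snap.positions.items():
            if pid == shooter_id:
                continue
            # hit test against the rewound position, not the current one
            if (x - aim_at[0]) ** 2 + (y - aim_at[1]) ** 2 <= radius ** 2:
                return pid
        return None
```

So even if a target has already ducked behind a wall in the present-time world state, the shot is resolved against where the server believes the shooter saw them, which is exactly the behavior the complaint above describes.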

Turning to Cell Factor and the Ageia chip-- it's possible that they have some incredibly elegant solution for this, but the added complexity seems like an insurmountable problem.

If one character with a high ping is using his weird Force-like powers on some object to send it flying at another character who is lagging, what will happen? Will the physics effect take place before the character can "see" that it would happen? What about the physics effects of the lagged character? If one of them uses what appears to be an attractor power to make a bunch of objects swarm and gravitate to a point, but those objects are already being affected by a force from an unlagged character, how does each client's PPU calculate what's going on?

It can't. The calculations for such things have to be done, and synchronized, by the server, not the PPUs. Enabling communication between the PPUs over a LAN is possible, though still highly problematic. Having the PPUs communicate and synchronize over the internet without server mediation seems absolutely impossible-- and if the physics effects are not synchronized, there will be horrible artifacting, very bizarre-looking pathways, and all sorts of other problems.

The final solution would be requiring the server to calculate all the physics interactions on its PPU and send them out to each client PPU. This still consumes massive bandwidth, and while it partially eliminates the lag-- no client upload time-- it still falls victim to choke and to the receiving computer's ability to process the information and feed it to the PPU in real time.

I seriously doubt that Cell Factor will be playable in any good way over the internet. I have seen nothing from Ageia even beginning to address these problems.
 
There will definitely be issues with games that exploit large-scale environmental interaction when implemented on the current generation of network resources. But then again, that's exactly the kind of problem that drives demand for faster networks. I don't see people rushing out to upgrade to gigabit networks at home so they can share their printers faster. And while many of us would like it in theory, there isn't a groundswell of demand for 100mbit (or faster) internet connections to the home just to chat and browse Amazon faster.

The things that MMORPGs do in real time today were inconceivable in the good ol' days of 14.4 (and before). In exactly the same way, in 10 years LAN gaming environments will be interactive on a scale that is inconceivable today. Times change; games become more interactive; demand for speed increases. It's the circle of life.

There will be an awkward phase for about the next five-ish years (I'm hoping more like 3, but who knows) when single player games are going to have access to a new level of interactivity that can't scale well to the network. But networks will catch up, and you'll soon be able to play LAN games that have the same features as the single player titles of the next couple of years.

There will definitely be titles coming out that take a "two flavours" approach. In single player mode you'll have tons of real-time environmental effects that don't scale to LAN play (and thus are turned off for LAN play). But that's obviously just going to drive demand for better networks, and for more effective algorithms for dealing with the environmental data.

Now, I'm not trying to proselytize for Ageia. PhysX may not be the technology that drives the next wave. It may not start for another couple years. But it will start. And things will change dramatically.

Wow. I'm only 27, but I feel ancient here. The question:

How can technology X take off when we haven't seen anything like it before and the current technology Y can't handle it?

is an indicator of someone lacking long-term perspective. The answer is always driven by one immutable truth: the act of asking that question makes it obvious that technology Y is obsolete. And while technology X might fizzle before a successor to technology Y emerges, the inevitable truth is that one day, much further in the future, they will both be looked back upon as quaint and simplistic.
 
I think you're looking at the wrong end of the pipe here. It's not the players that are too slow; it's the servers that won't be able to push out all this network information to even 10 players on one server. It's tough for them right now to handle 32 players on one server, so how is 1000 objects going to work out in any way? Yes, the server will need a PPU card, otherwise the PPU info won't be able to be displayed. I think, though, that Ageia will force you to have a PPU for any PPU-"enhanced" game just to be mean. Maybe, though, the card will just do effects physics online.
 
The important thing would be that the outcome of any physical event is kept up-to-date on all of the clients. As in many MMORPGs, the client would be responsible for maintaining intermediate display states, with the server providing periodic refreshes (at a rate determined by bandwidth) to keep the clients in sync. This way a client computer can lack a physics processor and still show appropriate results, even though the intermediate frames will lack the details. A client computer with a physics processor could do its own calculations, and even request less frequent updates from the server.
 
I was thinking something along those lines, with clients being given a "this event happened" message, then being responsible for rendering the event and its aftermath (even though they are done independently, every machine should arrive at the same end result), and then confirming that result with the server. Of course, this would mean the server would have to render (I say render, but really mean calculate) the event as well. Perhaps this will see a move to decentralized games? A central server confirming important things like player positions and actions, and clients keeping track of the environment mostly on their own?
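The "client fills in the frames between server refreshes" idea from darrella's post can be sketched in a few lines: the client holds the last two server snapshots and blends object positions between them at render time. This is a toy illustration under assumed data shapes, not any actual engine's code.

```python
# Sketch of snapshot interpolation: the server refreshes periodically,
# and the client generates smooth intermediate display states.
# Data layout is invented for illustration.

def interpolate(snap_a, snap_b, render_time):
    """Blend two server snapshots so objects move smoothly between updates.

    snap_a, snap_b: (timestamp, {obj_id: (x, y, z)}), with snap_a the older.
    """
    t0, objs0 = snap_a
    t1, objs1 = snap_b
    alpha = (render_time - t0) / (t1 - t0)  # 0..1 between the snapshots
    alpha = max(0.0, min(1.0, alpha))
    blended = {}
    for oid, p0 in objs0.items():
        p1 = objs1.get(oid, p0)  # object missing from newer snap: hold position
        blended[oid] = tuple(a + (b - a) * alpha for a, b in zip(p0, p1))
    return blended
```

A client without a PPU just interpolates like this; a client with one could instead simulate forward from the last snapshot and only correct when the next refresh arrives.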
 
darrella said:
The important thing would be that the outcome of any physical event is kept up-to-date on all of the clients. As in many MMORPGs, the client would be responsible for maintaining intermediate display states, and the server provides a refresh to stay in sync (at a rate determined by bandwidth). This way a client computer can lack a physics processor and still show appropriate results, even though the intermediate frames will lack the details. A client computer with a physics processor could do its own calculations, and even request less frequent updates from the server.

I don't think it's that simple. psychoace's original question is a VERY important one. Take this example: a bunch of players in an FPS are in a big courtyard when a shrapnel bomb goes off in the middle. Using an old-school game engine, the blast is a simple spherical phenomenon. Damage varies as a function of distance from the center. Using a PhysX engine, the blast is actually modeled as 5000 pieces of metal - possibly with different sizes and velocities. Statistically, your "expected damage" is still determined by distance from the blast, but you might get lucky and come away relatively unscathed, or you might take one chance piece of shrapnel in the eye even if you are the farthest away at the time.

A game of this type is arguably more realistic, and could end up being perceived as a compelling reason to use physics in-game. (You could also cheat in game development and just use random variables generated at the server to model the non-uniform damage, but there are other game types where such a hack isn't a viable option.)

Also, you really can't just dump it on the server as you suggest, because the level of detail that such a model requires means it matters whether the piece of shrapnel hits you in the neck or the shoulder. That's not an issue if you are modeling shrapnel with infinite velocity (like bullets in current games), but if you are modeling physics with finite velocities, you can't just predict the outcome server-side and broadcast it. After all, if you're jumping off a roof when the bomb goes off, the difference between getting hit in the arm and the neck is less than a ping. (Again, that's not a big deal if particles have infinite velocity, but with realistic physics, there's a LOT less you can predict with certainty beforehand on the server side.)

This kind of scenario would require every client to have the same data set (the initial sizes and velocities of each piece of shrapnel) sent to them within a very tight time window for the game interactions to work properly. Certainly modeling the actual trajectories can be done client-side, but there is a need to "announce" extremely large data sets (orders of magnitude larger than what current game engines require) to the whole network very quickly for lag not to be an issue. The latency and bandwidth needs for such a game seem to be beyond current network/server capabilities.
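To make the contrast between the two damage models concrete, here's a toy sketch. Everything in it (fragment count, target size, damage numbers) is made up for illustration; the point is only that the per-fragment model is statistical per blast, which is exactly why every client would need the same fragment data set.

```python
# Toy contrast: old-school radial blast damage vs. a per-fragment model.
# All constants are invented for illustration.
import math
import random

def radial_damage(dist, max_damage=100.0, radius=10.0):
    """Old-school model: damage is a simple function of distance."""
    if dist >= radius:
        return 0.0
    return max_damage * (1.0 - dist / radius)

def fragment_damage(dist, n_fragments=5000, per_hit=4.0, radius=10.0, rng=None):
    """Per-fragment model: each shard flies off in a random direction and
    only hurts you if its path happens to cross you. Expected damage still
    falls off with distance (inverse-square), but any single blast can
    deviate wildly from the average."""
    rng = rng or random.Random()
    if dist >= radius or dist <= 0.0:
        return 0.0
    # fraction of the blast sphere's surface a ~0.5 m^2 target covers
    target_fraction = 0.5 / (4.0 * math.pi * dist * dist)
    hits = sum(1 for _ in range(n_fragments) if rng.random() < target_fraction)
    return hits * per_hit
```

The radial model is a pure function of distance, so the server can resolve it alone; the fragment model depends on which specific shards crossed you, so every machine has to agree on the full shard set.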
 
Darrella, shadowwyvern,

Both of those resolutions would be highly problematic.

If the game is entirely reliant on the server to process all of the physics-related information, then the server will have to be, well, massively superpowered, or the physics effects will have to be basically nonexistent. The PPU communicates through the PCI bus-- even if it could manage to channel that information through the PCI bus to the CPU fast enough, the CPU would be overwhelmed by the 12, 24, 36, whatever client-based physics events happening simultaneously.

Then the CPU would have to transfer that information out again to the network card and disperse it to the clients. If it were doing a complete "this is the physics state of the world" refresh every time, that would be a massive burst of information. If it were doing only a "this is what has changed since last time" burst, those would have to come very, very rapidly for the physics to be simulated properly. If it were sending just a "here's a discrete physical event (grenade hits wall), now calculate its effects" bit of information to each PPU, leaving the PPU on its own to calculate it-- that's probably the best solution, but it only works if there is only one brand of PPU chip and one driver for it. If the PPU is successful, there will be far more than one brand. And if the clients then have to verify with the server that they've done it correctly, that will also be far too taxing.

I don't see any way around this problem aside from massively increased bandwidth, both internal and external to the computer. That will come, I'm sure, but right now, as any online player knows, latency and lag-producing effects are things to be avoided at all costs.
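Of the three server-update strategies listed above, the "what has changed since last time" one is easy to sketch. This is purely illustrative; the data format and threshold are invented, and a real protocol would also have to flag removed objects.

```python
# Sketch of delta-compressed world updates: the server compares the
# current world state to the last acknowledged one and sends only the
# objects that actually moved. Format/threshold are invented.

def delta_update(prev_state, curr_state, threshold=0.01):
    """Emit only the objects that moved more than `threshold` since the
    last update (plus any newly spawned objects)."""
    changed = {}
    for oid, pos in curr_state.items():
        old = prev_state.get(oid)
        if old is None or any(abs(a - b) > threshold for a, b in zip(pos, old)):
            changed[oid] = pos
    return changed

def apply_delta(state, delta):
    """Client side: merge the delta into the last known world state."""
    new_state = dict(state)
    new_state.update(delta)
    return new_state
```

The catch the post identifies still applies: with a thousand debris objects in flight, "only what changed" is nearly everything, every tick, so the delta barely helps during the moments that matter most.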
 
If the game is entirely reliant on the server to process all of the physics-related information, then the server will have to be, well, massively superpowered, or the physics effects will have to be basically nonexistent. The PPU communicates through the PCI bus-- even if it could manage to channel that information through the PCI bus to the CPU fast enough, the CPU would be overwhelmed by the 12, 24, 36, whatever client-based physics events happening simultaneously.
I don't see how this differs from what each client would already have to be doing as well. Isn't this why a PPU is supposed to be so powerful?

I agree that the issue is problematic. Both accuracy and efficient use of bandwidth are necessary. I really think it will come down to the creativity of the programmers to meet this challenge. One will not be able to just add massive physics to an existing online game/engine without considering these issues; however, IMO, if they are taken into consideration there is a lot that is doable.
 
darrella said:
I don't see how this differs from what each client would already have to be doing as well. Isn't this why a PPU is supposed to be so powerful?

I agree that the issue is problematic. Both accuracy and efficient use of bandwidth are necessary. I really think it will come down to the creativity of the programmers to meet this challenge. One will not be able to just add massive physics to an existing online game/engine without considering these issues; however, IMO, if they are taken into consideration there is a lot that is doable.

But what if you get that packet of information from the server saying a bomb exploded at point 320,3000 or something? Will each PPU calculate the same physics and handle all the particles the same way for every player? Also, what about the physics in CellFactor, with all the pipes and moving them around? That was a few hundred independently moving items that each moved and functioned in their own way. How can the network keep up with where each one is positioned, what angle it's at, what speed it's moving, and whether it hit anybody?
 
I don't think anybody will really try, unless they're being heavily pushed (paid) by Ageia. 3dfx was good at this when they rolled out their graphics boards.

It will be hard enough to get developers to add real support for game physics (instead of eye candy that can be turned on or off, allowing the game to easily support those with/without addin PPU boards), much less network code that can robustly support this type of stuff.

There just won't be enough of a valid target audience any time soon to justify the expense.
 
I think it breaks down like this (I very well might be wrong):

The clients don't calculate PPU effects; that would be done on the server to make sure it all stays in sync (especially important if physics objects can do damage).

The clients receive the data about what all the objects are doing and render that on screen.

Each client shouldn't need a PPU, since they're not doing the calculations; they're simply displaying the pre-calculated data.

Any ping correction calculations can be applied by the server before it uses the client's data to alter the state of the physics, so it shouldn't cause any nasty effects.

The main downside will be the amount of bandwidth needed to run a server. Most clients should be able to keep up now that we're starting to see faster connections (all of the UK recently got upgraded to 8Mbit, and let's face it, most games today only require 15-20k/s max anyhow), so even pushing that to hundreds of k/s should be fine for even 1Mb broadband users.

I have no idea how much data that is uncompressed, but I'm guessing most broadband connections could cope with it?

There is a decent upside to that, however: games can be sold with PPU support for multiplayer where only the server needs a PPU, so all the clients get the benefits of gameplay with these physics without having to spend the cash on the PPU card.
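Frosteh's "hundreds of k/s" guess can be sanity-checked with back-of-envelope arithmetic. The per-object payload (14 bytes for an id plus quantized position and rotation) and the 20 Hz tick rate here are assumptions for the sake of the estimate, not anything Ageia has published.

```python
# Back-of-envelope bandwidth estimate for a server resending every
# object every tick. The 14 bytes/object and 20 Hz are assumptions.

def server_bandwidth_kBps(n_objects, n_clients, bytes_per_object=14, tick_rate=20):
    """Outgoing server bandwidth in kB/s if every object state is
    rebroadcast to every client on every tick."""
    return n_objects * bytes_per_object * tick_rate * n_clients / 1024.0

# 1000 objects, 16 clients: ~4375 kB/s total out of the server,
# but each individual client only downloads ~273 kB/s of it.
```

Under these assumptions the per-client download is indeed "hundreds of k/s", which a fast broadband line could swallow, while the server's upstream is in the multi-megabyte-per-second range, which is exactly why only the server side looks painful.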
 
Frosteh said:
There is a decent upside to that, however: games can be sold with PPU support for multiplayer where only the server needs a PPU, so all the clients get the benefits of gameplay with these physics without having to spend the cash on the PPU card.
Good luck seeing widespread adoption of PPUs in servers. There's often just not enough room for another card or any card at all. Also, there could be heat related issues if the PPU puts out a lot of heat.
 
dotK said:
Good luck seeing widespread adoption of PPUs in servers. There's often just not enough room for another card or any card at all. Also, there could be heat related issues if the PPU puts out a lot of heat.

Dunno what servers you're renting or building but they're not very good.

Any well built server/server room should have more than sufficient cooling to handle an extra PPU card.

If the market needs this as standard in game servers over the next few years, then expect them to support it; those who don't are likely to run into financial trouble if they can't even keep up with the hardware games require.
 
Obdicutmodo said:
Cell Factor, which is supposed to be the "killer app" for Ageia--
'Killer app,' what? Where did you get that idea?

Ageia never advertised it as such. In fact, CF is nothing more than a tech demo thrown together in the span of a few weeks, so Ageia had something to show off PhysX at GDC. Ageia has never claimed otherwise.

CF seemed to wow folks at GDC, so Ageia decided to put it into the wild so people could play something that actually made good use of the PPU. They have never stated CF is going to be a full-blown game; quite the contrary. However, because people seemed impressed with this little demo, they have left the door open to developing it into an actual game. *IF* CF gets developed into a game, it could be a killer app, but that wouldn't happen for a year or two. More than anything, Ageia is releasing the CF tech demo hoping that modders go balz out with it.

I agree there are a lot of issues to be ironed out using PhysX in a multiplayer environment, but these problems won't exist for single player titles.
 
Low Roller said:
However, because people seemed impressed with this little demo, they have left the door open to developing it into an actual game.

Kinda like Far Cry in that respect: it started as a demo to show off shader effects for Nvidia, became independent, and on release happened to run a lot better on ATI's video cards. Funny, that :)
 
I don't think this is a PPU problem; we just need our ISPs and network manufacturers to stop being so slow and lazy and up the bandwidth already. 512 up and 5+ down isn't going to cut it.

If you watch Diggnation, the podcast from Kevin Rose and his friend, they mentioned how millions of dollars were "invested" in ensuring we would have about 50mb/second by 2005 or 2006. Well, now we're NOWHERE close.

So this just puts the bottleneck on the ISPs. It's that simple.
 
Low Roller said:
'Killer app,' what? Where did you get that idea?

Ageia never advertised it as such. In fact, CF is nothing more than a tech demo thrown together in the span of a few weeks, so Ageia had something to show off PhysX at GDC. Ageia has never claimed otherwise.

CF seemed to wow folks at GDC, so Ageia decided to put it into the wild so people could play something that actually made good use of the PPU. They have never stated CF is going to be a full-blown game; quite the contrary. However, because people seemed impressed with this little demo, they have left the door open to developing it into an actual game. *IF* CF gets developed into a game, it could be a killer app, but that wouldn't happen for a year or two. More than anything, Ageia is releasing the CF tech demo hoping that modders go balz out with it.

I agree there are a lot of issues to be ironed out using PhysX in a multiplayer environment, but these problems won't exist for single player titles.

Cell Factor was originally announced for the 360 last year, and later for the PC. So it's not 'just a tech demo'; it was in development long before it got Ageia's support.
 
Skirrow said:
Cell Factor was originally announced for the 360 last year, and later for the PC. So it's not 'just a tech demo'; it was in development long before it got Ageia's support.
From CGOnline's recent interview with Artificial Studios:
CGOnline: How did you come up with the concept of CellFactor: Combat Training?

Artificial Studios: In mid-January 2006, we decided to develop a PhysX-oriented multiplayer FPS with Ageia, and worked out some interesting mechanics to allow players to use physical objects simultaneously with their gunplay. But we needed an underlying story and visual style to go with the anticipated game play. The production team (Immersion Games) that we work with had an interesting storyline for a single player action-adventure FPS called CellFactor. So we adapted background elements from the CellFactor storyline, combining them with the game play mechanics we'd worked out for a multiplayer game, becoming CellFactor: Combat Training. The entire development cycle for the CellFactor: Combat Training demo took 2 months, from mid-January to mid-March.

Link
So, as a PhysX showcase, which is the context of my comment you quoted, CellFactor is indeed 'just a tech demo.'
 
Frosteh said:
Dunno what servers you're renting or building but they're not very good.

Any well built server/server room should have more than sufficient cooling to handle an extra PPU card.

If the market needs this as standard in game servers over the next few years then expect them to support it, those who don't are likely to run into financial troubles if they can't even keep up with the hardware games require.
The 2U case I used to have had a whopping 2 fans for air circulation, and they were partially obstructed by the hard drives. It got fairly warm under load.
 
nonlnear said:
I don't think it's that simple. psychoace's original question is a VERY important one. Take this example: a bunch of players in an FPS are in a big courtyard when a shrapnel bomb goes off in the middle. Using an old-school game engine, the blast is a simple spherical phenomenon. Damage varies as a function of distance from the center. Using a PhysX engine, the blast is actually modeled as 5000 pieces of metal - possibly with different sizes and velocities. Statistically, your "expected damage" is still determined by distance from the blast, but you might get lucky and come away relatively unscathed, or you might take one chance piece of shrapnel in the eye even if you are the farthest away at the time.
Until network technologies and coding have improved to the point where it's feasible to transmit information on thousands of objects for 'real' physics in real time, I'd imagine that multiplayer gaming will continue to use the 'old-school' collision detection methods that games currently use (like the blast sphere). People with PPUs could use them in multiplayer, but the additional effects will be cosmetic only, layered on top of the old-school model.

Physics will evolve: perhaps games will take more factors into account, such as the shape and velocity of the explosive when it detonates. Then the game could have a more realistic 'blast sphere' for collision detection. That'll be far easier than trying to calculate, compress, and transmit every piece of shrapnel over the network in real time (which, after reading this thread, seems like a revolutionary feat to accomplish).
 
Low Roller said:
From CGOnline's recent interview with Artificial Studios:So, as a PhysX showcase, which is the context of my comment you quoted, CellFactor is indeed 'just a tech demo.'

Sure, the CellFactor: Combat Training demo may just be a tech demo. But the CellFactor game is still on the way.

Announcement from May 2005 here:

http://www.gamershell.com/news/22185.html

Original screenies from July 2005 here:

http://uk.gamespot.com/xbox360/action/cellfactor/screenindex.html

Wouldn't surprise me if we see a single player demo released soon after E3. The 'tech demo' was prolly released to the public so people actually have something to show off their cards with, instead of having to wait months till the first proper games are available. Unless the original game has been cancelled, of course, and I missed the announcement.

Anyhow, with regard to the original topic: I reckon there will be a few problems with so many objects flying around. Even if each computer starts with the same parameters when simulating an explosion, the number of variables is huge and there will always be differences in spread patterns and collisions. On some machines one object may intersect another at a different angle and ricochet completely differently than the same event on another PC. Unless the collision detection is 100% perfect, and I'm talking sub-pixel accuracy, I can see there being problems.

That would make using debris as cover an issue, as it may not be in the same place on another PC. Tracking player movements is pretty demanding, but when you throw in the trajectories of 1000s of boxes, pipes, debris, etc., the bandwidth usage will be insane. LAN may be able to cope, but internet play prolly won't work. I have a 10mbit connection with 12288kb/s downstream but only 384kb/s upstream. There's no way my connection could handle it. It'd prolly take 3 seconds to upload enough positional data for 1 second of gameplay.

I actually only see a PPU as being really effective for single player, for now. Unless someone pulls a super network algorithm out of their ass, I just don't see it as feasible.
 
Given the discussion above, it would seem that the PPU will only be really effective for single player games. It seems to me that the physics solutions offered by ATI/Nvidia are far more attractive if, like me, you're more focused on net multiplayer games, i.e. I don't need to waste money on a $300 PPU because my video card will be able to produce the same visual effects.

That's what it really comes down to. Ageia has marketed the PPU as something that will allow thousands of static objects to be influenced in real time. So why should I buy a PPU if I'm only getting visual effects as opposed to the real-time physics promised by Ageia?

Of course, it may well be worth it if you're a single player buff.
 
Skirrow said:
Sure the Cellfactor: Combat training demo may just be a tech demo. But the Cellfactor game is still on the way.
Read both parts of the interview I linked. There's a lot of good info in it.
It's worth noting that CellFactor: Combat Training is at heart just a really fun demo, and that while AGEIA will be adding some new content to it, we currently have no plans to develop it into a full game -- unless perhaps there is suitable demand :)
We're talking about two different things. This and most other threads here talking about CellFactor have been in relation to PhysX. Someone referred to CellFactor as supposedly being Ageia's 'killer app,' but that's never been what Ageia, Artificial Studios, or Immersion Games had in mind for this demo.

Ageia is putting this out there because the content supporting their card is very limited. For the same reason, they are also trying to get modders to go nuts with CF.
 
nonlnear said:
I don't think it's that simple. psychoace's original question is a VERY important one. Take this example: a bunch of players in an FPS are in a big courtyard when a shrapnel bomb goes off in the middle. Using an old-school game engine, the blast is a simple spherical phenomenon. Damage varies as a function of distance from the center. Using a PhysX engine, the blast is actually modeled as 5000 pieces of metal - possibly with different sizes and velocities. Statistically, your "expected damage" is still determined by distance from the blast, but you might get lucky and come away relatively unscathed, or you might take one chance piece of shrapnel in the eye even if you are the farthest away at the time.

...You can't just predict the outcome server-side and broadcast it. After all, if you're jumping off a roof when the bomb goes off, the difference between getting hit in the arm and the neck is less than a ping. (Again, that's not a big deal if particles have infinite velocity, but with realistic physics, there's a LOT less you can predict with certainty beforehand on the server side.)

I'll address each paragraph separately. For the first: while that sounds great, and could maybe be done for a single explosion to demo a technology, I highly doubt a server could handle calculating the entire travel path from start to finish for 5000 objects every time an explosion occurred, considering the frequency with which explosions can occur during a multiplayer game (Q3A: RA3? lol). Add to that bullet fire and collidable objects moving about, and not only is the server going to struggle to keep up, but it will require an enormous amount of bandwidth.

To the second: the hit simply won't be reported until it occurs (which, given the speed of the shrapnel, would not be much time at all). However, all hit determination would still be performed server-side with the standard inputs from the client. For any given piece of shrapnel, the server would calculate where it *will* be at any point during its "life" (at some point it will hit a wall or the ground and stop moving). That data is stored for the brief time it takes for the lifespan of every piece of shrapnel from a given explosion to expire. If any model passes through the "path" (expressed as an equation for a line in 3-D space), the server would then calculate whether it did so at a time such that it would get hit, and from there, where the model was hit.

In addition, it would be far easier to simulate this effect using random hit detection, object occlusion, etc. instead of actually calculating the paths of 5000 objects which will appear in maybe 3-5 frames?
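Spewn's path-precalculation scheme boils down to storing each fragment as a parametric line p(t) = origin + velocity*t and solving for the time it enters a player's hitbox. A minimal sketch under simplifying assumptions (constant velocity, a spherical hitbox standing in for a real model):

```python
# Sketch of server-side time-of-impact for a precomputed fragment path.
# Assumes constant (nonzero) velocity and a spherical hitbox; all names
# are invented for illustration.
import math

def hit_time(origin, velocity, player_pos, radius, t_max):
    """Return the time the fragment's path |origin + v*t - player_pos|
    first enters the player's sphere of the given radius, or None.
    Solves the quadratic |origin + v*t - player_pos|^2 = radius^2."""
    d = tuple(o - p for o, p in zip(origin, player_pos))
    a = sum(v * v for v in velocity)
    b = 2.0 * sum(v * dd for v, dd in zip(velocity, d))
    c = sum(dd * dd for dd in d) - radius * radius
    disc = b * b - 4.0 * a * c
    if disc < 0.0:
        return None  # path never crosses the hitbox
    t = (-b - math.sqrt(disc)) / (2.0 * a)  # first crossing
    return t if 0.0 <= t <= t_max else None
```

The server would run this for each stored fragment against a player's (possibly moving) position near the candidate time; the "less than a ping" objection above is about whether the player position fed into this test can ever be accurate enough.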
 
Spewn said:
I'll address each paragraph separately. For the first: while that sounds great, and could maybe be done for a single explosion to demo a technology, I highly doubt a server could handle calculating the entire travel path from start to finish for 5000 objects every time an explosion occurred, considering the frequency with which explosions can occur during a multiplayer game (Q3A: RA3? lol). Add to that bullet fire and collidable objects moving about, and not only is the server going to struggle to keep up, but it will require an enormous amount of bandwidth.

If it's shown in demos today, it will be in almost every title within five years. And your statement about bandwidth was my point exactly.

To the second: the hit simply won't be reported until it occurs (which, given the speed of the shrapnel, would not be much time at all). However, all hit determination would still be performed server-side with the standard inputs from the client. For any given piece of shrapnel, the server would calculate where it *will* be at any point during its "life" (at some point it will hit a wall or the ground and stop moving). That data is stored for the brief time it takes for the lifespan of every piece of shrapnel from a given explosion to expire. If any model passes through the "path" (expressed as an equation for a line in 3-D space), the server would then calculate whether it did so at a time such that it would get hit, and from there, where the model was hit.

That's a much better concept of how an explosion could be modeled in-game and still be (somewhat) tractable. I hadn't really thought out in detail what would be done server-side/client-side earlier.

Essentially that boils down to all the interactive physics being done server-side, and effects physics rendered client-side. I don't know why that wasn't painfully obvious to me from the start, but that's how it has to work, unless there were a different network paradigm, i.e. an InfiniBand (or some other very low-latency interconnect) connection between the PPUs, which would calculate interactive physics cooperatively. And yes, I know it's never going to happen.

In addition, it would be far easier to simulate this effect using random hit detection, object occlusion, etc., instead of actually calculating the paths of 5,000 objects which will appear in maybe 3-5 frames.

It will take time, but this is the inevitable direction of the technology. Kind of like how early 3D games used multiple 2D sprites for characters, because the horsepower wasn't quite there to render everything in 3D. Now that 3D rendering has caught up to what was once a pipe dream, using multiple sprites seems kludgy for many applications. Soon it will be a bad memory.

Same thing for in-game physics. At first, we'll see full-blown physics for a few key events in games. As the hardware gets faster (and wider), and software implementations mature, we'll look back on some of the non-physics-based solutions as quaint.
 
nonlnear said:
It will take time, but this is the inevitable direction of the technology. Kind of like how early 3D games used multiple 2D sprites for characters, because the horsepower wasn't quite there to render everything in 3D. Now that 3D rendering has caught up to what was once a pipe dream, using multiple sprites seems kludgy for many applications. Soon it will be a bad memory.

Same thing for in-game physics. At first, we'll see full-blown physics for a few key events in games. As the hardware gets faster (and wider), and software implementations mature, we'll look back on some of the non-physics-based solutions as quaint.


The difference is this: graphics rendering required an entirely new approach to the issue. Non-real-time rendering far surpassed what we could do in real time, but even IT was crappy back then (remember FF7's "amazing" pre-rendered scenes? lol). Why has it gotten better? I can tell you that 3dsMax doesn't produce the scenes it does today because my hardware is better; I could load up 3dsMax 8 on a P150 and render the exact same scene, it would just take an eternity. The issue with graphics was complex: they didn't have an "equation" (or even a set of them) to start from, into which one could plug numbers like mass, kinetic energy, and velocity (both angular and linear), and then receive a perfect (photorealistic) result. With physics, you CAN achieve real-world-simulation-quality results; the equations are there, and it's getting them processed IN TIME that is the hard part now.

What you see as a new avenue in computing, I see as something that will one day be relegated to either an extra chip on the MB, an extra core on the CPU, or an extra core on the Video Card. The physics equations in question are VERY simple and can all be done by pretty much any grade 12 physics student(assuming they've paid attention in class). I, personally, see the physics processor going the way of the math co-processor. They're both things which should inevitably be added to typical CPU design anyway, the first was, we'll see how many years it takes for the next one.
 
dotK said:
Good luck seeing widespread adoption of PPUs in servers. There's often just not enough room for another card or any card at all. Also, there could be heat related issues if the PPU puts out a lot of heat.

Nahhh, the real problem is with bandwidth.

Right now, FPS games are bandwidth misers because most of the map is static. Only a handful of things are allowed to be modified by the user. Thus, for any given update, you only have a few dozen things to keep track of.

Now, take a look at that Cell Factor demo. Imagine the bandwidth load of sending updates pertaining to HUNDREDS of objects, perhaps THOUSANDS of objects. Once users are encouraged to damage the map, or manipulate the many objects on that map, they will do so en masse, and kill the server's fat pipe.
 
Think MMORPG style server-wide(or multi-server, as the case may be) bandwidth use, but with only 16 players on a single map.
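A rough back-of-envelope calculation makes the point. The object counts, 20 bytes per object, and 20 Hz tick rate below are all made-up illustrative numbers, not figures from any real engine:

```python
def update_bandwidth_bps(num_objects, bytes_per_object, tick_rate_hz):
    """Raw per-client download rate for full state updates, in bits/second."""
    return num_objects * bytes_per_object * tick_rate_hz * 8

# A couple dozen dynamic props, typical of a mostly-static FPS map:
classic = update_bandwidth_bps(num_objects=24, bytes_per_object=20, tick_rate_hz=20)

# Thousands of live physics objects, Cell Factor style:
physics_heavy = update_bandwidth_bps(num_objects=2000, bytes_per_object=20, tick_rate_hz=20)

print(classic)        # 76800 bps, i.e. ~77 kbps -- trivial
print(physics_heavy)  # 6400000 bps, i.e. 6.4 Mbps -- per client, before protocol overhead
```

Multiply the second figure by 16 players and the server's upstream requirement looks a lot more like an MMORPG's than a typical FPS's.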
 
defaultluser said:
Nahhh, the real problem is with bandwidth.

Right now, FPS games are bandwidth misers because most of the map is static. Only a handful of things are allowed to be modified by the user. Thus, for any given update, you only have a few dozen things to keep track of.
Even with a lot of physics effects, I still don't think games would push the limits of today's broadband and LAN bandwidth. With thousands of physics objects, I'd imagine compression would be necessary for the packetized network traffic; latency would remain the more important consideration.
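One plausible form that compression could take, sketched here with invented field sizes, is quantizing coordinates to 16-bit integers and sending only the objects that actually moved since the last tick:

```python
import struct

WORLD_MAX = 4096.0  # hypothetical map extent in world units

def quantize(coord):
    """Map a float coordinate in [0, WORLD_MAX) to a 16-bit integer."""
    return int(coord / WORLD_MAX * 65535) & 0xFFFF

def pack_update(moved_objects):
    """Serialize only the objects that moved this tick: id + quantized x, y, z.
    Four 16-bit fields = 8 bytes per object, versus 16 bytes with raw floats."""
    payload = b""
    for obj_id, (x, y, z) in moved_objects:
        payload += struct.pack("<HHHH", obj_id, quantize(x), quantize(y), quantize(z))
    return payload

packet = pack_update([(7, (100.0, 200.0, 50.0)), (9, (3000.0, 10.0, 10.0))])
print(len(packet))  # 16 bytes for two objects
```

Resting objects (a box that has settled on the floor) cost nothing per tick under this scheme, which matters because most of those thousands of objects are stationary most of the time.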
 
Spewn said:
I'll address each paragraph separately. For the first, while that sounds great, and could maybe be done for a single explosion to demo a technology, I highly doubt a server could handle calculating the entire travel path from start to finish for 5,000 objects every time an explosion occurred, considering the frequency with which explosions can be occurring during a multiplayer game (Q3A: RA3? lol). Add to that bullet fire and collidable objects moving about, and not only is the server going to struggle to keep up, but it will require an enormous amount of bandwidth.

The server would need a PPU to do this; then the number of calculations suddenly doesn't matter, and any ping compensation that's applied to clients' actions would simply be applied before the relevant information is sent to the PPU.

To the second: the hit simply won't be reported until it occurs (which, given the speed of the shrapnel, would not be much time at all). However, all hit determination will still be performed server-side with the standard inputs from the client. For any given piece of shrapnel, the server would calculate where it *will* be at any point during its "life" (at some point it will hit a wall or the ground and stop moving). That data is stored for the brief time it takes for the life-span of every piece of shrapnel from a given explosion to expire. If any model passes through the "path" (expressed as an equation for a line in 3-D space), the server would then calculate whether it did so at a time such that it would get hit, and from there, where the model was hit.

Again, these calculations can be offloaded onto the PPU; the whole point of the PPU is to do these calculations in hardware so that many can be done very fast.

In addition, it would be far easier to simulate this effect using random hit detection, object occlusion, etc., instead of actually calculating the paths of 5,000 objects which will appear in maybe 3-5 frames.

Easier? yes...

Better? I don't think so. More realistic explosions mean more realistic gameplay, which means more tactics and more fun: being able to pick up a barrel, place it between you and an explosion, and hide behind it for protection from a grenade.

Better yet, we can all do what we've wanted to do for YEARS in team-oriented multiplayer games, and that's dive on top of grenades to save teammates at the expense of gibbing yourself :D

Even with a lot of physics effects, I still don't think games would push the limits of today's broadband and LAN bandwidth. With thousands of physics objects, I'd imagine compression would be necessary for the packetized network traffic; latency would remain the more important consideration.

Upload wouldn't need much: only what the client does as he interacts with the objects he specifically pushes/pulls or shoots, etc.

Download would be more important, but we all have more download bandwidth than upload, and I can't see physics data maxing out decent broadband connections; bandwidth gets cheaper, not more expensive. Servers would need a LOT more bandwidth, but that doesn't make them unrealistic to have. Hell, downloading game demos nowadays can be up to 700MB; I remember when I was shocked at the massive 200MB install for all of Fallout (IIRC).

Times are always changing.
 
Spewn said:
The difference is this: graphics rendering required an entirely new approach to the issue. Non-real-time rendering far surpassed what we could do in real time, but even IT was crappy back then (remember FF7's "amazing" pre-rendered scenes? lol). Why has it gotten better? I can tell you that 3dsMax doesn't produce the scenes it does today because my hardware is better; I could load up 3dsMax 8 on a P150 and render the exact same scene, it would just take an eternity. The issue with graphics was complex: they didn't have an "equation" (or even a set of them) to start from, into which one could plug numbers like mass, kinetic energy, and velocity (both angular and linear), and then receive a perfect (photorealistic) result. With physics, you CAN achieve real-world-simulation-quality results; the equations are there, and it's getting them processed IN TIME that is the hard part now.

You seem to be saying that physics modeling speed is perfectly analogous to rendering. Which was my point. Am I missing something?

What you see as a new avenue in computing, I see as something that will one day be relegated to either an extra chip on the MB, an extra core on the CPU, or an extra core on the Video Card. The physics equations in question are VERY simple and can all be done by pretty much any grade 12 physics student(assuming they've paid attention in class).

Actually, the equations involved in rendering are much simpler. Vector dot products and cross products are much simpler than numerical simulations of viscoelastic motion. Of course, if you're using an explicit algorithm (which I think most game engines would), that's just matrix multiplication, but implicit methods require inversion (well, you actually just do some sort of elimination, as explicit inversion is a BAD idea).
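To make the explicit-vs-implicit point concrete, here is a toy explicit-style integration step (semi-implicit Euler, sketched with arbitrary numbers): the whole per-object update is a handful of multiply-adds, which is exactly the kind of work that parallelizes well on wide hardware.

```python
# Semi-implicit (symplectic) Euler step for one point mass. No matrices to
# invert, no elimination: just multiply-adds, repeated per object per tick.
def euler_step(pos, vel, force, mass, dt):
    ax, ay, az = (f / mass for f in force)
    new_vel = (vel[0] + ax * dt, vel[1] + ay * dt, vel[2] + az * dt)
    new_pos = tuple(p + v * dt for p, v in zip(pos, new_vel))
    return new_pos, new_vel

# One simulated second of gravity at 60 ticks/s (demo values):
pos, vel = (0.0, 10.0, 0.0), (0.0, 0.0, 0.0)
for _ in range(60):
    pos, vel = euler_step(pos, vel, (0.0, -9.81, 0.0), 1.0, 1.0 / 60.0)
```

An implicit method would instead solve a system of equations at every step, which is far more stable for stiff problems (cloth, viscoelastic solids) but is the part that requires elimination rather than plain multiply-adds.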

I, personally, see the physics processor going the way of the math co-processor. They're both things which should inevitably be added to typical CPU design anyway, the first was, we'll see how many years it takes for the next one.

I think that the physics processor might end up being integrated onto the CPU, but only if CPU design ends up taking some very particular turns in its long-term development path. In fact, your hypothetical proposition is of the same nature as the old saying that 3D graphics would eventually be incorporated onto the CPU. It didn't happen, but in a different world it might have.

The case for moving physics calculations onto the CPU in the next 20 years seems very weak to me. The nature of the architecture is so completely different that it's practically unworkable. For the kind of parallelism that long vector calculations (like physics) require, the nature of caching changes: what you need is a small code cache (well, bigger is better, but you don't need a ton) and massive bandwidth for your data. Architecturally, that looks a lot like a GPU...

What I see as being more likely is the physics calculations moving onto the graphics card. This would only require two things to change:

(i) graphics pipelines would have to become fully programmable - with no program length limit.
(ii) graphics memory access rules would have to change slightly (I think - I'm not really sure about the details, but I'm concluding that if it were possible to move physics calculation results off the graphics card efficiently that Nvidia and ATI would already be doing interactive physics.)

Both of these things might happen very soon.
 
As I just said:

IMHO, the best place for the physics calculations to go (assuming that discrete cards die out) is the GPU, not the CPU. This requires the two things I'd mentioned. (Honestly, I don't know if number 2 has happened yet.)

1. unlimited GPU program length.
2. more generic (ie bidirectional) GPU memory access.

AFAIK, ATI hasn't adopted infinite program length, but it can't be far off. I think Nvidia is already there. Sorry, it's been a while since I followed GPU architecture in detail.

Really the only barriers I see to moving physics to the GPU are political. I don't believe that the Havok business model can compete against a free SDK. Seems to me like Ageia is ripe for a buyout - if only their business model weren't tied to selling discrete hardware... But that might not be so bad either.

Imagine Nvidia buying Ageia. PhysX implementations directly integrated into Nvidia cards, and discrete PhysX cards sold separately for:

- people who don't have NVidia cards
- people who have high-end Nvidia rigs and want all their GPU power for graphics
- people who want a discrete physics card for other applications (if such applications ever take off)

Of course, this could happen with any graphics company doing the Ageia buyout, but I'd hope for Nvidia if anyone. Well, maybe a comeback for Matrox would be nice, but it ain't gonna happen.

Sorry if this is getting off topic.
 
Frosteh said:
The server would need a PPU to do this; then the number of calculations suddenly doesn't matter, and any ping compensation that's applied to clients' actions would simply be applied before the relevant information is sent to the PPU.



Again, these calculations can be offloaded onto the PPU; the whole point of the PPU is to do these calculations in hardware so that many can be done very fast.
My question was whether or not a server with a single PPU card could handle a level the size that I'm used to playing, and all the physics that would go with it (if you included everything from the tech demos). Nobody seems to be able to answer that, or even get the question.

Better? I don't think so. More realistic explosions mean more realistic gameplay, which means more tactics and more fun: being able to pick up a barrel, place it between you and an explosion, and hide behind it for protection from a grenade.

Better yet, we can all do what we've wanted to do for YEARS in team-oriented multiplayer games, and that's dive on top of grenades to save teammates at the expense of gibbing yourself :D

Simple object occlusion (and in fact, you could add object density, tensile strength, etc., to determine whether the hit can pass through that object, and if so how far) would take care of *both* of those cases. I'm not talking about explosions which cause vehicle tires, boxes, barrels, whatever to fly about; I'm talking about actually modelling each fragment from a bomb's casing. You won't be able to see the bomb explode and then DODGE the shrapnel. You *likely* won't even see the shrapnel. The only reason you can "see" bullets in some of today's games is because of coding.

Even over long distances (hundreds of meters), bullets travel fast enough that you can't see them, unless you're firing tracer rounds AT NIGHT (which work via a phosphor coating and air friction, so in fact they don't "light" until a fair distance after they leave the barrel, farther out than 80% of FPS engagements). So what benefit would actually modelling the path of all those pieces of shrapnel provide? What you seem to be misunderstanding is that it would produce the exact same result to simply calculate using the existing method but use randomness and object occlusion to simulate explosion shrapnel on the small scale (grenades, small bombs, etc. Large shrapnel, like an entire car door, is a different story; of course, you also won't have 5,000 car doors fly off a car when you blow it up...). In fact, you could even add a time delay for models that are further from the center of the explosion, and do the calculations necessary to determine whether the model got out of the way in time. Figure an explosion might affect 5 players at once (if you get a good shot); that's a lot less bandwidth (both network and PPU) than calculating 5,000 object paths, then figuring in occlusion, then scoring the hit.

Edit: This is something called "Indistinguishability". As far as I'm concerned, if there's no way for me to know that something is different from something else, then they are the same. Before you call me naive, understand that science(though more specifically, quantum mechanics) follows the same rules.
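The randomness-plus-occlusion approximation being argued for could look something like this (the damage numbers and falloff shape are invented for illustration, not taken from any real game):

```python
import random

def shrapnel_damage(dist, occluded, base_damage=100.0,
                    lethal_radius=2.0, max_radius=10.0, rng=random):
    """Statistical shrapnel: no per-fragment paths, just distance falloff,
    a random roll, and a binary occlusion check (all numbers invented)."""
    if occluded or dist > max_radius:
        return 0.0
    if dist <= lethal_radius:
        return base_damage
    # Expected damage falls off linearly between the lethal and max radii;
    # the random factor stands in for which fragments "happened" to hit.
    falloff = 1.0 - (dist - lethal_radius) / (max_radius - lethal_radius)
    return base_damage * falloff * rng.random()
```

One occlusion raycast and one damage roll per nearby player, versus thousands of tracked fragments: indistinguishable to the victim, orders of magnitude cheaper to the server.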
Download would be more important, but we all have more download bandwidth than upload, and I can't see physics data maxing out decent broadband connections; bandwidth gets cheaper, not more expensive. Servers would need a LOT more bandwidth, but that doesn't make them unrealistic to have. Hell, downloading game demos nowadays can be up to 700MB; I remember when I was shocked at the massive 200MB install for all of Fallout (IIRC).

Times are always changing.

I think the biggest problem in this thread is that everyone is assuming I'm saying these things aren't possible. What I'm saying is this: I haven't seen Ageia address these issues yet, and technology companies are usually not quiet about things that work, but they're very quiet when something *doesn't* work.
 
Spewn said:
My question was whether or not a server with a single PPU card could handle a level the size that I'm used to playing and all the physics that would go with it(if you included everything from the tech demos). Nobody seems to be able to answer that, or even get the question.

I guess I haven't really been talking about the original question. I've been going off about longer term outlooks, when there is an immediate (and important) question. Frankly, I think the answer is "maybe, sometimes". It's going to really depend on a particular game's implementation.

Simple object occlusion. ... So what benefit would actually modelling the path of all those pieces of shrapnel provide?

What you seem to be misunderstanding is that it would produce the exact same result to simply calculate using the existing method but use randomness and object occlusion to simulate explosion shrapnel on the small scale (grenades, small bombs, etc. Large shrapnel, like an entire car door, is a different story; of course, you also won't have 5,000 car doors fly off a car when you blow it up...). In fact, you could even add a time delay for models that are further from the center of the explosion, and do the calculations necessary to determine whether the model got out of the way in time. Figure an explosion might affect 5 players at once (if you get a good shot); that's a lot less bandwidth (both network and PPU) than calculating 5,000 object paths, then figuring in occlusion, then scoring the hit.

It was a stretch for me to say that finite velocity matters for shrapnel from a grenade. But there will be events where it's a big deal. When things are flying slowly enough, you won't be able to do simple occlusion from a snapshot of the world at one point in time. You've got to model the dynamics. That's the whole point of the PPU - to let you do that in cases where you couldn't before.

True, a grenade explosion wasn't a good example; it was the first thing I could think of when I wanted an example of lots of objects moving around, and an easily accessible case of something current games handle with simplistic approximations. The point I was trying to make was that there might be new game dynamics available which previously weren't, due to the computational scale of the models involved.

Here's a better example: an FPS scenario involves a seismically unstable area (just after an earthquake) with tall buildings. Your mission is to destroy/disable counterweights on the roofs of two key buildings while aftershocks are occurring. Modeling the elastic reactions of the buildings completely, with varying degrees of counterweighting, or even with different modes of malfunctioning, is one possible application of a PPU that would affect gameplay for a significant number of players (for example, on 50 floors of a building) over a long timescale. You can't occlude the problem away; you have to model it. If a building "snaps" and falls over to one side, you might survive if you're in the right part of the "stump". (Snapping "like a twig" isn't normal for buildings, but if one had sabotaged a counterweight appropriately, it's conceivable.)

That's probably a better example.
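As a toy illustration of why that scenario has to be simulated rather than occluded away: the roof counterweights described are essentially tuned mass dampers, and even a single-degree-of-freedom model shows how sabotaging one changes the building's behavior over a long timescale (the damping ratios below are invented numbers):

```python
import math

def sway_amplitude(t, damping_ratio, natural_freq):
    """Envelope of a damped single-degree-of-freedom oscillation after an
    impulse (an aftershock). A working tuned-mass counterweight raises the
    effective damping ratio; sabotage lowers it."""
    return math.exp(-damping_ratio * natural_freq * t)

# Invented numbers: 30 s after the shock, the intact building has nearly
# settled, while the sabotaged one is still swinging at ~86% amplitude.
intact = sway_amplitude(30.0, damping_ratio=0.05, natural_freq=1.0)
sabotaged = sway_amplitude(30.0, damping_ratio=0.005, natural_freq=1.0)
```

A real in-game model would need many coupled floors rather than one oscillator, but the point stands: the state evolves continuously over minutes, and every player in the building needs a consistent view of it.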

Edit: This is something called "Indistinguishability". As far as I'm concerned, if there's no way for me to know that something is different from something else, then they are the same. Before you call me naive, understand that science(though more specifically, quantum mechanics) follows the same rules.

I completely agree with you.

I think the biggest problem in this thread is that everyone is assuming I'm saying these things aren't possible. What I'm saying is this: I haven't seen Ageia address these issues yet, and technology companies are usually not quiet about things that work, but they're very quiet when something *doesn't* work.

I guess the reason that I haven't been attacking the original question directly is that there really isn't much to go on. There is a very salient question here: is the PhysX PPU going to be able to deal with multiplayer gaming with interactive physics enabled? If so, to what extent? The deafening silence from Ageia that you take as evidence to the negative, I read quite differently. I think the answer depends completely on game developers. Implementing interactive physics effects for a multiplayer LAN game is a big problem, even without current network limitations. Getting any degree of playable physics interaction over even gigabit ethernet will take some very delicate balancing of how messages get passed, and how, when, and where computations are performed. Reaching an effective solution to this is going to take more than a couple of months. And it's not really Ageia's job: they have a vital role to play in catalysing the process, but Ageia really can't do it alone. Ageia is not in the business of writing game engines. That's why I think we haven't heard much from Ageia on this point.
 
nonlnear said:
Here's a better example: an FPS scenario involves a seismically unstable area (just after an earthquake) with tall buildings. Your mission is to destroy/disable counterweights on the roofs of two key buildings while aftershocks are occurring. Modeling the elastic reactions of the buildings completely, with varying degrees of counterweighting, or even with different modes of malfunctioning, is one possible application of a PPU that would affect gameplay for a significant number of players (for example, on 50 floors of a building) over a long timescale. You can't occlude the problem away; you have to model it. If a building "snaps" and falls over to one side, you might survive if you're in the right part of the "stump". (Snapping "like a twig" isn't normal for buildings, but if one had sabotaged a counterweight appropriately, it's conceivable.)
That's a good example of revolutionary gaming physics. I think most people are reluctant about hardware physics acceleration partly because they're still thinking in terms of the relatively primitive physics that games employ nowadays, which are for the most part isolated incidents in a static environment. A grenade explosion in a room with indestructible walls seems far from computationally or bandwidth intensive.

Of course, this doesn't answer the original question of how to have such effects on a networked game. Basically, I don't think anyone has a clear answer at this point. However, I agree with nonlnear on the point that it's mainly up to game developers to figure it out. Ageia may provide some hardware and a physics middleware solution, but it's up to developers to implement it.
 
Spoudazo said:
I don't think this is a PPU problem; we just need our ISPs and network manufacturers to stop being so slow and lazy and up the bandwidth already. 512 up and 5+ down isn't going to cut it.

If you watch Diggnation, the podcast from Kevin Rose and his friend, they mentioned how millions of dollars were "invested" in ensuring we would have about 50Mb/second by 2005 or 2006, and now we're NOWHERE close.

So this just puts the bottleneck on the ISPs, it's that simple.

Well, it all comes down to who has the most cubic dollars, I suppose. My ISP is upgrading to 900Mb using wireless backhaul, and it's $60,000 *just* for the dang wireless hardware. They're running everything off T3 right now, so it's going to be a huge change, and one huge internet bill.

But think of Australia :p. From what people have told me, I just about laugh my ass off at their internet prices. I live out in the sticks, as you might put it (mountain valley, nothing really around), and 1.5Mb DSL is ~$35 a month. For those outside the 5-mile reach of wired DSL, wireless is available for $39.00 a month: 3Mb down, 1 up.

I also looked into getting a dedicated hosted server; I can't believe the prices some people want for 5Mb up/down interweb :eek:. Hopefully Verizon's deep pockets can push the FiOS setup to more people. I'm having wet dreams over that.
 
It is not necessary for an accurate physics simulation to pass data from 1000 falling boxes over the network.

The idea behind the physics processor is to show collisions in more detail as they occur from a single starting point, say a ball hitting a wall. On a host machine with a physics processor, the action begins a cascade effect of the wall collapsing brick by brick.

The only information that should be sent to a remote machine via the network is the velocity and angle of the ball hitting the wall. On the remote client - and assuming the same physics processor is installed - the end result is the same collapsing cascade, brick by brick. It is not necessary to transmit each brick's position over the network.

Without a physics processor, the collapse occurs in less detail but should still occur.

The scenario I assume you're discussing is tracking the effect of bricks from the collapse, say, hitting a particular player local or remote. Someone playing in the environment using a physics processor has an advantage in which more bricks are visible and can therefore be avoided. A remote client without a physics processor, however, should still take damage based on the proximity to the collapsing wall but there will be less visual feedback to help avoid the falling bricks.

In neither scenario is it necessary to transmit data about the status of specific falling or collapsing objects.
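That deterministic-cascade idea can be sketched in a few lines: as long as both machines run the same fixed-timestep simulation seeded by the same triggering event, only that event needs to cross the wire (the function and all its parameters below are hypothetical):

```python
import random

def simulate_wall_collapse(impact_speed, impact_point, seed, num_bricks=100):
    """Both server and client run this with the same inputs. As long as the
    simulation is deterministic (same seed, fixed timestep), they compute
    identical brick trajectories with no per-brick network traffic."""
    rng = random.Random(seed)
    bricks = []
    for brick_id in range(num_bricks):
        # Each brick's kick is derived deterministically from the one impact.
        kick = impact_speed * (0.5 + rng.random())
        offset = impact_point[0] + rng.uniform(-1.0, 1.0)
        bricks.append((brick_id, offset, kick))
    return bricks

server_view = simulate_wall_collapse(12.0, (3.0, 1.5), seed=42)
client_view = simulate_wall_collapse(12.0, (3.0, 1.5), seed=42)
assert server_view == client_view  # identical cascade on both machines
```

The catch, and it is exactly the mixed PPU/non-PPU case described above, is that anything nondeterministic (thread timing, or floating-point differences between a PPU and a software fallback) breaks the guarantee that both sides compute the same cascade.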

I think, long-term, a physics solution that occurs directly on the graphics card is superior to using a card on a separate PCI bus, but we'll see where the market takes us.
 
While I'm too lazy to read everyone's responses, it would make sense to me that the server would only send a packet to your system telling it what object (can, body, barrel, vehicle, etc.) was hit, and then allow your PPU (if you have one) to calculate what would happen to it.
 
^^^
Oh, that's all? Right now, the server only has to do that for, say, a dozen, maybe two dozen, objects in the entire level. Ageia states the current-gen PPU can handle thousands (5,000 seems to be the magic number, but I don't know where it came from). The server must keep track of the location and the linear and angular velocity of every collidable (read: interactable) object in the game level. This means every piece of shrapnel, every body part, every bullet, et al. Bandwidth isn't the *biggest* concern here; what is, is whether or not a server with a single current-gen PPU card will be able to handle implementing something like what is seen in Cell Factor over a large level with 12-16 players on it. If one of the demo levels from Cell Factor is 1/8th the size of one of the larger maps from RA3, but contains thousands of objects, will games using multiplayer maps 8x the size have 8x the objects, or will they be thinned out? For the client, this doesn't matter, since the boxes, fluids, smoke, etc. on the other side of the map or behind a wall don't have to be calculated by your PPU. For the server, it matters. How much? I don't know. Neither does anyone else. I'm asking, not telling.
 