This works well for the current physics model in terms of compressibility over the network. But when future games crank up the realism of their physics - things like deformable interactive environments, interdependent physics, and ongoing effects - it becomes a much harder problem. For example, cascading physics such as a realistic earthquake would require massive amounts of server-side processing, and a lot of data would also have to reach the clients, since the earthquake affects how the game is played (you can't just send outcome data, because what happens during the earthquake matters too). Even if the server and clients transmit only the data required for gameplay, it may become unavoidable to send data about thousands of objects across the network anyway.

zooraf said:
The scenario I assume you're discussing is tracking the effect of bricks from the collapse, say, hitting a particular player, local or remote. Someone playing in the environment with a physics processor has an advantage: more bricks are visible and can therefore be avoided. A remote client without a physics processor, however, should still take damage based on its proximity to the collapsing wall, but there will be less visual feedback to help avoid the falling bricks.
In neither scenario is it necessary to transmit data about the status of specific falling or collapsing objects.
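That idea can be sketched out in code. The following is a minimal, hypothetical illustration (the names `CollapseEvent`, `apply_proximity_damage`, and `client_visualise` are invented for this example, not from any real engine): the server broadcasts one compact event describing the collapse, applies damage by proximity to every player, and a physics-capable client re-creates the debris locally from a shared seed for visuals only - no per-brick state ever crosses the wire.

```python
import math
import random
from dataclasses import dataclass

@dataclass
class CollapseEvent:
    """The only thing transmitted: a compact description of the collapse."""
    x: float            # epicentre of the collapsing wall
    y: float
    radius: float       # radius within which debris can hit players
    seed: int           # lets capable clients re-simulate debris deterministically

def apply_proximity_damage(event, players, max_damage=50.0):
    """Server-side: damage scales with proximity to the collapse.
    players maps player id -> (x, y); no per-brick data is needed."""
    damage = {}
    for pid, (px, py) in players.items():
        d = math.hypot(px - event.x, py - event.y)
        if d < event.radius:
            damage[pid] = max_damage * (1.0 - d / event.radius)
    return damage

def client_visualise(event, n_bricks=1000):
    """Client-side (physics-capable only): regenerate cosmetic debris
    from the shared seed, so every such client sees the same bricks."""
    rng = random.Random(event.seed)
    return [(event.x + rng.uniform(-event.radius, event.radius),
             event.y + rng.uniform(-event.radius, event.radius))
            for _ in range(n_bricks)]
```

A client without a physics processor simply skips `client_visualise` and still receives correct damage, which is exactly the asymmetry described above: same gameplay outcome, less visual feedback.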
Of course, whether the server can even handle the load is another matter. Maybe servers will be up to it by the time 'realistic' physics becomes mainstream? Who knows when that will be...