AGEIA is going to be rich

The cost is because of all the damn RAM they put on it. I wish NVIDIA or ATI would just make it like a math co-processor, a la the 386DX, and then place 512MB of GDDR on it. Let's face it, going from 256MB to 512MB isn't going to boost you that much. Plus they'd get the added speed of the PPU placed right next to the GPU. I think an added 50 bucks on my video card is better than paying 300 bucks for a separate card.

Is that a Molex on the card on the front page? They're not wasting any time grabbing power, hehe :eek:
 
Sly said:
how is the one without the PPU supposed to handle it?

Well, if they become very popular, which is looking to be the way of things, they'd need a less fancy software physics path and then the fancier PPU physics running in support for a game such as you described. That's the only way to keep things even without requiring a certain piece of hardware. I assume they hope to get standardized as fast as possible, though, to become as common as a graphics accelerator.

Pharacon said:
The cost is because of all the damn RAM they put on it. I wish NVIDIA or ATI would just make it like a math co-processor, a la the 386DX, and then place 512MB of GDDR on it. Let's face it, going from 256MB to 512MB isn't going to boost you that much. Plus they'd get the added speed of the PPU placed right next to the GPU. I think an added 50 bucks on my video card is better than paying 300 bucks for a separate card.

Is that a Molex on the card on the front page? They're not wasting any time grabbing power, hehe :eek:

For anyone who wants the link:

http://www.hardocp.com/image.html?image=MTExNjYwMTUzNjAyN1RkMWczNlNfMV8xX2wuanBn


Noticed that about the 12V as well. No time wasted at all ;)

All I can think about the PCIe x1 connector on the other side is that the board can host both: to save on production costs they made one universal board, then picked whichever format they wanted to design for from there. Should help on production, I should think.
 
Hahahaa, another freaking chip to watercool, so it makes less noise?


I'll wait this one out, like I waited out the first Voodoos, hehehe. When a game won't launch without one of these, I'll buy it, but till then, meh.
 
FanATIc said:
Well, if they become very popular, which is looking to be the way of things, they'd need a less fancy software physics path and then the fancier PPU physics running in support for a game such as you described. That's the only way to keep things even without requiring a certain piece of hardware. I assume they hope to get standardized as fast as possible, though, to become as common as a graphics accelerator.

If possible, maybe we'll get a DirectX extension for hardware physics too. :)
 
cell_491 said:
...it has to do with creating lifelike motion and natural physics. For example, if you shoot a wall in a current game, all that happens is a few flat hole decals appear and later disappear. With the PPU and games designed for it, you'll shoot the wall and it will realistically flake the paint back, chip the rock or concrete, and send shrapnel throughout the room

I'm keeping my fingers crossed. Don't you just hate the fact that no matter how many rockets you launch, grenades you toss, or bombs you set off, the environment manages to stay intact? My personal favorite is the aforementioned self-repairing bullet holes.
 
I have that Thermaltake heatsink on my old GeForce 4. Quite possibly the loudest cooling solution ever once the fan gets some dust in it. You have been warned...

I watched the videos... in theory this sounds awesome, but I think it also has the potential to be a huge scam or a vaporware product. We'll see.


Also, with the advent of dual-core processors and 64-bit extensions, wouldn't it possibly be easier to start programming games to take advantage of those instead of having to have an extra card in the system? Or is there something specific this does that your processor can't? Both would require coding games differently from the ground up... but that's a paradigm that has needed to change for a long time anyway.
 
Sly said:
How exactly is Havok integrated in a game? Function calls? API calls? I imagine this would be implemented the same way. How about if Havok were hardware-based, using the same function calls as the software version, so the AGEIA hardware could be accessed the same way?

But while this may be beneficial for the eye candy of single-player games, just how would this affect multiplayer ones? You'd basically have different things showing on a PC with the card and a PC without it. You blow a building to bits; how is the one without the PPU supposed to handle it? Would it skip the fancy physics altogether? While the one with the PPU has his path and vision blocked by the debris and smoke, none of that would be showing up for the guy without the PPU, and he'd have a clear shot through the hole while you're still squinting through the smoke.

I would imagine that the PC without the PPU would render the hole and explosion minus all the fancy effects the PPU would provide. The guy without the PPU would get the bang, the flash of light, some cheesy smoke effect and, oh lookie there, a hole in the wall. The guy with the PPU would see all sorts of debris flying, along with portions of the wall crumbling to expose the hole.
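
In code terms, I'd picture the integration looking something like this hypothetical sketch (none of these class or function names are Havok's or AGEIA's actual API, just the general shape of it): the game talks to one physics interface, and the backend decides whether the work runs in software or gets handed to the card.

```cpp
// Hypothetical sketch only; not Havok's or AGEIA's real API.
// The game calls one interface either way; only the backend differs.

class PhysicsBackend {
public:
    virtual ~PhysicsBackend() {}
    virtual void simulate(float dt) = 0;  // advance the world by dt seconds
};

class SoftwareBackend : public PhysicsBackend {
public:
    void simulate(float dt) { /* integrate bodies on the CPU */ }
};

class PpuBackend : public PhysicsBackend {
public:
    void simulate(float dt) { /* hand the scene to the card, read results back */ }
};

PhysicsBackend* createBackend(bool ppuPresent) {
    // Same function calls for the game either way.
    if (ppuPresent) return new PpuBackend();
    return new SoftwareBackend();
}

int main() {
    PhysicsBackend* physics = createBackend(false);  // no PPU installed
    physics->simulate(1.0f / 60.0f);                 // one 60 Hz tick
    delete physics;
    return 0;
}
```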

 
tesfaye said:
I'm keeping my fingers crossed. Don't you just hate the fact that no matter how many rockets you launch, grenades you toss, or bombs you set off, the environment manages to stay intact? My personal favorite is the aforementioned self-repairing bullet holes.

Red Faction? :D I actually had fun tunneling around the demo map and dropping into the glass house from the ceiling :D Caught out in the open in a crossfire? No prob! Drop a satchel and BANG, instant foxhole! Trapped in a room with one door? Blow another one! It'll be cool if Unreal 3 can do those too!
 
So...

I'm sitting here thinking...

Dual GPUs, dual-core CPUs...

*looks at all the empty pci slots on his mobo*

Can I buy four $100 cards and slap them in there, and would I get any benefit out of it? I mean, one PPU can boost a scene from 2,000 objects to 32,000... so would four working in conjunction give me something in the range of 122,000 objects (the 2,000 base plus 4 x 30,000)? How many times HL2's object count could we actually hit with something like this?

And we haven't even talked about different clock speeds or overclocking it yet.

food for thought.
 
JasperW said:
I have that Thermaltake heatsink on my old GeForce 4. Quite possibly the loudest cooling solution ever once the fan gets some dust in it. You have been warned...

I watched the videos... in theory this sounds awesome, but I think it also has the potential to be a huge scam or a vaporware product. We'll see.


Also, with the advent of dual-core processors and 64-bit extensions, wouldn't it possibly be easier to start programming games to take advantage of those instead of having to have an extra card in the system? Or is there something specific this does that your processor can't? Both would require coding games differently from the ground up... but that's a paradigm that has needed to change for a long time anyway.

It's not that your current CPU can't do it; it can, to a certain extent. It's just like graphics acceleration on a 3D card: a CPU is built to handle anything thrown at it at the same speed, but a chip designed to do one thing specifically, not just everything, can do that thing much faster than a general processor that does anything and everything.

Also, I think that putting these on graphics cards is the worst idea I've ever heard. Why is my first question. You would steal potential bandwidth and power from your graphics card, yet it would still cost you the same amount as the separate card would; you'd just have added the price to your graphics card, turning your $500 card into an $800 card. And if the memory is already being used, it's already being used; you can't just magically use it for something else. You'd still need extra RAM for the PPU on top of the GPU's.
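
For a feel of what "designed to do one thing specifically" means here, this is roughly the kind of loop a physics chip grinds through every frame: the same tiny integration step repeated over thousands of bodies, which is exactly what wide, specialized hardware eats for breakfast. (A made-up illustration, not AGEIA's code; the object count just echoes the numbers being quoted in this thread.)

```cpp
#include <cstddef>
#include <vector>

// One rigid body's state; a real engine tracks much more than this.
struct Body {
    float px, py, pz;   // position (m)
    float vx, vy, vz;   // velocity (m/s)
};

// The per-frame inner loop: identical math for every body. A general
// CPU walks it one element at a time; specialized hardware can run
// many of these lanes in parallel.
void integrate(std::vector<Body>& bodies, float dt) {
    const float g = -9.81f;  // gravity (m/s^2)
    for (std::size_t i = 0; i < bodies.size(); ++i) {
        bodies[i].vy += g * dt;
        bodies[i].px += bodies[i].vx * dt;
        bodies[i].py += bodies[i].vy * dt;
        bodies[i].pz += bodies[i].vz * dt;
    }
}

int main() {
    std::vector<Body> bodies(32000);   // the PPU-scale object count
    integrate(bodies, 1.0f / 60.0f);   // one 60 Hz frame of motion
    return 0;
}
```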
 
Sly said:
Red Faction? :D

You know, I never got around to playing that game. I heard so many bad things about it that I stayed away. :-(


Sly said:
It'll be cool if Unreal 3 can do those too!

You're damn right it would!

I hope this works out and the product is solid and not some buggy, overhyped piece of crap. The videos Brent linked to in an earlier post are very impressive. Damn, I can just see another $1200 leaping out of my pocket next year on a new video card, an A64 X2 processor, a Creative X-Fi sound card (if they turn out to be all they're cracked up to be), and possibly some more RAM.
 
pain.angel said:
Also, I think that putting these on graphics cards is the worst idea I've ever heard. Why is my first question. You would steal potential bandwidth and power from your graphics card, yet it would still cost you the same amount as the separate card would; you'd just have added the price to your graphics card, turning your $500 card into an $800 card. And if the memory is already being used, it's already being used; you can't just magically use it for something else. You'd still need extra RAM for the PPU on top of the GPU's.

Hrm, did I say that? Or is it in response to someone else?

Back in the days of Packard Hell, circa 1996-ish, integrated peripherals were terrible. They used more resources, caused more problems, and generally made a system slower. Today integrated peripherals are becoming more common, even standard: most motherboards come with extra storage controllers, Ethernet controllers, and audio all built in. If this became a de facto standard, I don't see it as impossible that we'd see it integrated into other hardware like video cards or motherboards.

However, I do agree that initially it would be a bad idea, just from the cost aspect. I don't think there's much of a bandwidth issue on the x16 PCI Express bus.

Also, yeah, specialized processing can be amazing! I remember my old 4MB Orchid Righteous 3D; that thing played Quake awesomely :) Blew my mind at the time... so I definitely believe this _can_ work, but at the same time processors are scaling up so much that I think dual cores could probably handle it pretty well too. Time will tell! Let's just hope that developers start coding games multi-threaded in general, whether it's for a specialized PPU or a second processor.
 
ColinR said:
In that pic on the front page of [H], is that a PCIe (x1) connector at the "top" or some diag connector? That might bring the price down - a combined PCI and PCIe card. Just turn it upside down and move the securing bracket. It doesn't look quite right in the pic, as it doesn't have the screw holes in the right place.

That definitely looks like a PCIe connector, and that definitely looks to be the case.
 
A year or so ago, we had everything onboard...sound, RAID, ethernet, etc...

Now, with all the new SLI boards, some boards, even high-performance models from the likes of Asus, are only sporting three or four free PCI slots (compared to the five or six we had before), and two or so of the new slots are (as of now) useless PCI-E x1 slots.

Here's hoping all this cool new next-gen physics stuff will FINALLY be an excuse to use those x1 slots we all have vacant now.

Any word on whether Creative's new sound card will have a x1 version?

Can anybody actually link me to an x1 product that actually exists?
 
That card is just an engineering sample to show that it can work with both PCI and PCI-Express (which that one does). The actual retail cards will have one or the other, not both.
 
GoldenTiger said:
That card is just an engineering sample to show that it can work with both PCI and PCI-Express (which that one does). The actual retail cards will have one or the other, not both.
It's definitely an engineering sample, because the HSF is made for a GeForce 4 video card... it says so right on the front of it.
 
So with this PPU, a single system will have three main processors: a scalar one for integer and floating-point math, a vector processor with sometimes questionable internal precision, and a specialized vector (?) floating-point processor.

CPU, GPU and PPU. I wonder whatever happened to the 'central' in CPU? Perhaps it's time to either rework or redefine CPUs?
 
It'd definitely be cool if the retail versions had both PCI and PCIe connectors like that, though...

The problem is it'd be far too convenient, so we'll never see it :(
 
"You blow a building to bits, how is the one without the PPU supposed to handle it?"

just like they do it now days.

if you dont have the hardware to run it you get a substitution.

example .. fire looking weird on old geforce 4 cards in doom 3 vs my 6800 card.
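
In other words, a capability check at startup, the same way games already pick rendering paths per video card. A minimal made-up sketch of the idea (detectPpu() here is a placeholder, not a real driver call):

```cpp
#include <iostream>

enum PhysicsDetail { PHYS_BASIC, PHYS_FULL };

// Placeholder: stands in for whatever device query AGEIA's driver exposes.
bool detectPpu() { return false; }

int main() {
    PhysicsDetail detail = detectPpu() ? PHYS_FULL : PHYS_BASIC;
    if (detail == PHYS_FULL)
        std::cout << "PPU found: full debris and smoke simulation\n";
    else
        std::cout << "no PPU: canned explosion effect and a simple hole decal\n";
    return 0;
}
```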
 
cell_491 said:
It's definitely an engineering sample, because the HSF is made for a GeForce 4 video card... it says so right on the front of it.

Yeah, I kinda thought "THIS IS AN EVAL/DEMO BOARD" would have perhaps given that away... :p
 
Elledan said:
So with this PPU, a single system will have three main processors: a scalar one for integer and floating-point math, a vector processor with sometimes questionable internal precision, and a specialized vector (?) floating-point processor.

CPU, GPU and PPU. I wonder whatever happened to the 'central' in CPU? Perhaps it's time to either rework or redefine CPUs?

CPUs are great and all, but specialized processors are so much more powerful

IMO, have one processor be the bridge between all the other specialized processors in a system.
 
Brent_Justice said:
CPUs are great and all, but specialized processors are so much more powerful

IMO, have one processor be the bridge between all the other specialized processors in a system.


Well, isn't that just because of that reason? The GPU and PPU are specialized, with a limited and more predictable workload, so obviously a big cache isn't needed, nor speeds anywhere near a modern central processor's. Effectively, the thing in everyone's computer that does the broadest range of work is still the CPU. (That's not aimed at you, but at whom you quoted.) So I don't think renaming them is needed, and I don't think their performance should be questioned :).
 
Elledan said:
So with this PPU, a single system will have three main processors: a scalar one for integer and floating-point math, a vector processor with sometimes questionable internal precision, and a specialized vector (?) floating-point processor.

CPU, GPU and PPU. I wonder whatever happened to the 'central' in CPU? Perhaps it's time to either rework or redefine CPUs?
You forgot the APU...*cough* X-Fi *cough*... ;)
 
Brent_Justice said:
CPUs are great and all, but specialized processors are so much more powerful

IMO, have one processor be the bridge between all the other specialized processors in a system.
Hrm... it just seems to me that things have become quite fragmented since the early '90s. If you remember the Amiga, it relied on specialized processors for pretty much everything, with what might indeed be called the 'CPU' (to use PC terms) acting as a bridge between those PUs.

Right now PCs already more or less rely on soundcards, GPUs for reasonable graphics (beyond a simple terminal interface -> serial port), RAID chips, etc., to the point where things are beginning to seriously overlap.

In other words, where is the need for a generic PU (int, fp) if specialized PUs are so much more powerful? Is it only about price?

When I look at the Cell PU, I see 8 vector PUs with a single scalar PU acting as a bridge/controller. This could be considered the perfect PU for many tasks (gaming, video, physics), whereas the rest would do better with other specialized units.

Is the future of the PC the Amiga? ;)
 
tesfaye said:
You know, I never got around to playing that game. I heard so many bad things about it that I stayed away. :-(

The one you're talking about is probably Red Faction 2, and it has absolutely nothing to do with part 1! No terrain deformation, different weapons (no X-ray vision!), different story, different planet!

Red Faction was the first PC game to have true terrain deformation as a key part of the gameplay. You can drive a vehicle through a wall, punch out the ceiling around the edges and have the roof fall in one piece, or carve a long tunnel into an incline to make a shortcut. At least it gave me a good picture of what these people are describing (guy hides behind a wall; blow the wall, and him, away with a rocket launcher).

But honestly, I still don't know how a PPU would be able to help here :confused: There's no physics involved in terrain deformation except at the moment of impact; the game just figures out the blast area, generates some smoke, chops the affected sections into smaller pieces depending on the weapon, then makes those pieces tumble down. Before your CPU starts struggling with the physics, your video card would be struggling with the new polygons and transparencies first. Notice how the link showed SLIed 6800s? It takes that much GPU power to render the huge number of particles tracked by the PPU. If you were to limit the particle count to what a 5900 or a 9800 can handle, you could animate those particles with your normal CPU-based physics at little penalty. And once they turn static (no longer moving), the game usually makes them disappear to free up polygons and give your video card some breathing room. Perhaps the PPU would be best implemented with video cards two more generations down the road?

I'm currently downloading the big 18MB AVI I found at the link (I'm on dial-up :( ). Maybe that'll give me an idea of how it can be properly used.


EDIT:
I just remembered a certain situation in a current game: Doom 3. I remember spawning a couple dozen spider drones, then killing all of them and turning them into rag dolls. With two dozen dead drones in one pile, I could circle it with no problem at a decent framerate. Then I tried throwing a grenade. The moment they started bouncing, my framerate dropped to less than 1 fps! When they turned static, my framerate went back to normal.

Maybe a PPU would have helped here?
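
That lines up with how engines typically put rag doll parts "to sleep" once they stop moving: you only pay simulation cost for active bodies, which is why the still pile is free and the grenade is murder. A made-up sketch of the idea (the struct and threshold are invented for illustration, not id's code):

```cpp
#include <cmath>
#include <cstddef>
#include <vector>

// One chunk of a rag doll; real engines track orientation, contacts, etc.
struct RagdollPart {
    float px, py, pz;   // position
    float vx, vy, vz;   // velocity
    bool asleep;
};

void stepPhysics(std::vector<RagdollPart>& parts, float dt) {
    const float sleepSpeed = 0.05f;  // m/s; below this, stop simulating
    for (std::size_t i = 0; i < parts.size(); ++i) {
        RagdollPart& p = parts[i];
        if (p.asleep) continue;      // sleeping bodies cost nothing per frame
        float speed = std::sqrt(p.vx * p.vx + p.vy * p.vy + p.vz * p.vz);
        if (speed < sleepSpeed) {
            p.asleep = true;         // the grenade wakes them; rest puts them back
            continue;
        }
        // The expensive part: integration plus collision and contacts.
        p.px += p.vx * dt;
        p.py += p.vy * dt;
        p.pz += p.vz * dt;
    }
}

int main() {
    std::vector<RagdollPart> pile(192);  // two dozen rag dolls' worth of parts, zeroed
    stepPhysics(pile, 1.0f / 60.0f);     // one frame; motionless parts go to sleep
    return 0;
}
```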
 
Getting back to what somebody else said, why can't a dual-core CPU do the job of a PPU? I thought the whole point of programming multi-threaded games was offloading certain calculations (i.e. physics) onto different cores/CPUs. In other words, if a 2.4GHz 3400+ handles games perfectly fine today, why can't a dual-core 2.4GHz handle the same game with increased physics, since, in theory, there will be a whole other 2.4GHz core available to offload physics calculations onto?
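
What I have in mind is something like this sketch (just to illustrate the shape of it, using modern thread syntax; not any shipping engine's code): physics runs on the second core while the first handles everything else, with a sync point each frame.

```cpp
#include <thread>

void gameLogicAndRender() { /* AI, input, draw calls... */ }
void physicsStep(float dt) { /* integrate, collide, resolve contacts... */ }

int main() {
    const float dt = 1.0f / 60.0f;
    for (int frame = 0; frame < 3; ++frame) {
        std::thread physics(physicsStep, dt);  // second core runs physics
        gameLogicAndRender();                  // first core does the rest
        physics.join();                        // sync before the next frame
    }
    return 0;
}
```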
 
jebo_4jc said:
Getting back to what somebody else said, why can't a dual-core CPU do the job of a PPU? I thought the whole point of programming multi-threaded games was offloading certain calculations (i.e. physics) onto different cores/CPUs. In other words, if a 2.4GHz 3400+ handles games perfectly fine today, why can't a dual-core 2.4GHz handle the same game with increased physics, since, in theory, there will be a whole other 2.4GHz core available to offload physics calculations onto?
Dude, what you're not understanding here is that a CPU is not designed to process physics at this level; this card is specifically designed for it. This card is to physics as a video card is to graphics, and we all know that no matter how fast a CPU is, it will NEVER produce graphics like a standalone card designed especially for that function.
 
cell_491 said:
Dude, what you're not understanding here is that a CPU is not designed to process physics at this level; this card is specifically designed for it. This card is to physics as a video card is to graphics, and we all know that no matter how fast a CPU is, it will NEVER produce graphics like a standalone card designed especially for that function.


I think the point is that a graphics card is rendering textures, doing DX effects, etc. A PPU is really just doing craploads of math, isn't it? It offloads how to draw everything to the video card; it's there to tell the card where things need to be drawn. It seems like a CPU is good at doing lots of math. What's next, a specialized AI chip? Developers are complaining a lot lately about how much of a performance hit AI is. Maybe I don't understand exactly what the card is doing? (Actually, definitely :) ) I would like to learn more, though.

To the guy making Amiga references: you brought a lot back for me there :) I had an Amiga 3000 with SIX megs of video RAM; it was a BEAST. You can still edit video on it with Video Toaster! Maybe the Amiga was the way...
 
a CPU can do the same things as the PPU, but the PPU is much faster at it since it's specialized
 
Brent_Justice said:
a CPU can do the same things as the PPU, but the PPU is much faster at it since it's specialized
That's sort of what I'm trying to get at. The CPU can do what the PPU can do, just not as fast, and if a CPU were made to do the same things as the PPU, it might be slower at other things that matter to a PC. Personally, I think this is a HUGE step forward for PCs.
 
Man, PPUs are going to be awesome. That short film with the plane crashing into the boxes and oil drums, in which everything (including that crane) moved so realistically, was amazing.
 
OK, to those who aren't understanding why a specialized card is better, think about it like this.

The CPU has to do a lot more work; it's always doing something. You want to throw even more work at it instead of cutting it a break? Sure, it CAN do it. But it's never going to be as fast as a card that does nothing but perform one function.

Take MPEG-2 decoders back in the day. When DVD came to PCs, you either had to have a super powerful computer to get smooth playback, or get an MPEG-2 hardware decoder. If you went the CPU route, the DVD would lag your system to no end. On a 400-500MHz machine, playing a DVD would work, but don't expect to do anything else on the machine at the same time. Drop in a hardware MPEG-2 decoder and suddenly the machine is usable again, since the CPU is no longer trying to decode the MPEG stream.
 
You know... it sounds great, but there's no way in hell I'm getting ANOTHER card that wants a Molex connector. My PSU can't handle that, and I'm not about to spend MORE money on yet another PSU for something that may or may not take off... call me back in a year's time, maybe.
 
seithon said:
You know... it sounds great, but there's no way in hell I'm getting ANOTHER card that wants a Molex connector. My PSU can't handle that, and I'm not about to spend MORE money on yet another PSU for something that may or may not take off... call me back in a year's time, maybe.
I doubt the retail cards will require a Molex connector... anything that can be cooled by a heatsink that small can't be drawing that much wattage.
 
True... if it doesn't require a Molex, is PCIe, and is fairly cheaply priced, I'll probably get one. It would be interesting if their engine could run partly on the board and partly on the CPU, so even if you got a lower-end card the CPU could make up for some of it.
 
If this thing ends up costing between $249 - $299 it will be dead in the water. Hardly anyone will pay that much.
 
almostinsane1 said:
If this thing ends up costing between $249 - $299 it will be dead in the water. Hardly anyone will pay that much.
It won't just be one card; it'll be a series with a high end and a low end. Personally, I'll be buying the low end until there's a significant number of games that require the higher-end model.
 