Ageia!

Perhaps this belongs in the Blender forums themselves, but I’ve been curious about Ageia support. Please, don’t use this topic to say “Ageia sucks”. I don’t care about your opinion of it.

I recall reading on Blendernation how it won’t be implemented, but since the Ageia SDK is free… do you think it could happen for use in games, or maybe even for simulations outside of that?

It doesn’t suck, but I don’t see it as necessary anymore, especially since Nvidia already has a dedicated physics layer on their new 8-series hardware.

As for whether or not the Ageia SDK can be incorporated: I say sure, anything is possible.

That said, I don’t see it as something that is likely to happen with Blender.

The problem I see with adding physics to the graphics card is that it steals power I’d rather have go to the graphics, not the physics. That’s why I bought Ageia ;p Anyways, let’s not turn this into a debate about the usefulness of Ageia, please.

The engineers who worked on the 8 series accounted for all of that. Any drop in framerate caused by the accelerated physics calculations is negligible; you will never notice it.

We are talking about a card that can easily push beyond 100 fps on high settings. The 5-13 frames that the physics calculations cost really don’t matter.

Ageia provides great performance, there is no doubt, but it’s the kind of performance that none of today’s top games really need, and by the time they do, Nvidia and ATI will already have their physics-layer technologies so well established that it won’t even matter anymore.

Please stop trying to shield the Ageia franchise from valid critique with the “I don’t want a debate” excuse. Ignoring the facts never helped out anyone. You should realistically consider the possibility that the Ageia hardware you bought was a bad investment.

I used to contribute some graphics to VegaStrike (I was the local Blender evangelist, too), and I recall that the main developer and head cheese, Daniel Horn, published one of the first papers on using the GPU for fluid simulations – long before “CUDA” came onto the scene.

Perhaps this “GPU for physics” aspect is worth looking into, as GPGPU computation is available to anyone with a video card capable of running fragment shaders.

One thing I’ll say about the AGEIA: this baby ain’t exactly cheap, even bundled. I’d wait and see which approach catches on before investing.

Bullet Physics is open source, multiplatform and being improved all the time. See new features for Blender 2.43:
http://www.continuousphysics.com/Bullet/phpBB2/viewtopic.php?t=752
For the future, there are plans for SIMD, multi-core and GPU optimizations for Bullet.

I don’t see any future for this Ageia PPU chip; it’s way slower than a GPU and even slower than many CPUs.

More importantly, we can’t have a closed-source engine that doesn’t compile on all platforms. The other problem with adding Ageia support is that it takes users away from Bullet Physics, and that is really bad in my opinion.

Please support Bullet and other open source software, which fits better with Blender.
Thanks,
Erwin

Quote from Erwin himself on the Blender.org forums.

ATI and Nvidia are incorporating PPUs into their GPUs, so you won’t have to buy an entirely new card.
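Since Bullet is the engine Erwin recommends backing, here is a minimal sketch of driving Bullet directly from C++, outside of Blender, just to show how little code a rigid-body simulation takes. It assumes a reasonably current Bullet build (btDbvtBroadphase, for example, post-dates the 2.43-era releases); the file name, shapes, masses, and step count are purely illustrative.

```cpp
// Minimal Bullet Physics sketch: drop a sphere onto a static ground plane.
// Build against the Bullet headers/libs, e.g.:
//   g++ bullet_drop.cpp -I/usr/include/bullet -lBulletDynamics -lBulletCollision -lLinearMath
#include <btBulletDynamicsCommon.h>
#include <cstdio>

int main()
{
    // Core world setup: collision configuration, dispatcher, broadphase, solver.
    btDefaultCollisionConfiguration collisionConfig;
    btCollisionDispatcher dispatcher(&collisionConfig);
    btDbvtBroadphase broadphase;
    btSequentialImpulseConstraintSolver solver;
    btDiscreteDynamicsWorld world(&dispatcher, &broadphase, &solver, &collisionConfig);
    world.setGravity(btVector3(0, -9.81, 0));

    // Static ground plane at y = 0 (mass 0 makes a body static in Bullet).
    btStaticPlaneShape groundShape(btVector3(0, 1, 0), 0);
    btDefaultMotionState groundMotion(btTransform(btQuaternion(0, 0, 0, 1), btVector3(0, 0, 0)));
    btRigidBody::btRigidBodyConstructionInfo groundInfo(0, &groundMotion, &groundShape);
    btRigidBody ground(groundInfo);
    world.addRigidBody(&ground);

    // Dynamic 1 kg sphere of radius 1, dropped from y = 10.
    btSphereShape sphereShape(1.0);
    btVector3 inertia(0, 0, 0);
    sphereShape.calculateLocalInertia(1.0, inertia);
    btDefaultMotionState sphereMotion(btTransform(btQuaternion(0, 0, 0, 1), btVector3(0, 10, 0)));
    btRigidBody::btRigidBodyConstructionInfo sphereInfo(1.0, &sphereMotion, &sphereShape, inertia);
    btRigidBody sphere(sphereInfo);
    world.addRigidBody(&sphere);

    // Step the world at 60 Hz for two simulated seconds and print the sphere's height.
    for (int i = 0; i < 120; ++i)
    {
        world.stepSimulation(1.0f / 60.0f, 10);
        btTransform t;
        sphere.getMotionState()->getWorldTransform(t);
        std::printf("step %3d  sphere y = %.3f\n", i, double(t.getOrigin().getY()));
    }

    // Remove the bodies from the world before the stack objects go out of scope.
    world.removeRigidBody(&sphere);
    world.removeRigidBody(&ground);
    return 0;
}
```

Inside Blender you never write this yourself, of course; the game engine talks to Bullet through the rigid-body settings in the UI. A stand-alone sketch like this is only useful for prototyping or benchmarking the engine on its own.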