Nvidia acquires PhysX!

Feb 6, 2008 at 3:11 PM
Here's the release, taken from gamedev.net:

NVIDIA, the world leader in visual computing technologies and the inventor of the GPU, today announced that it has signed a definitive agreement to acquire AGEIA Technologies, Inc., the industry leader in gaming physics technology. AGEIA's PhysX software is widely adopted with more than 140 PhysX-based games shipping or in development on Sony Playstation3, Microsoft XBOX 360, Nintendo Wii and Gaming PCs. AGEIA physics software is pervasive with over 10,000 registered and active users of the PhysX SDK.

http://www.nvidia.com/object/io_1202161567170.html

Any thoughts/opinions? Not just on its relation to us! I'm excited, it opens up the possibility of SLI aiding physics. No more expensive PPU, just use the GPU!
Feb 6, 2008 at 3:26 PM
Yeah, I saw that yesterday. Quite interesting, especially after Intel acquired Havok.

I think it's great news, as long as the PhysX software remains free. And under nVidia, this is surely going to be the case. All of nVidia's dev tools are free.
Coordinator
Feb 6, 2008 at 3:29 PM
I don't know how much money it will save. If they use a GPU for physics they'll still need another one for graphics; I wouldn't want my framerate dropping because physics calculations were hogging my GPU. They might be able to build physics hardware onto graphics cards, if they could squeeze it on without making a huge card, but I'm not sure whether a single slot's PCI connection could transfer that much data (I don't know enough about PCI bandwidth).
Feb 6, 2008 at 3:39 PM
Check this out (may need to sign into Live then re-click link): http://members.microsoft.com/careers/search/details.aspx?JobID=6b94ac4f-0627-4851-8e6a-633186d96261

Coordinator
Feb 6, 2008 at 3:54 PM
Microsoft....taking over the world one piece at a time.
Feb 6, 2008 at 5:38 PM
Mike, I do not think that they will move physics to the GPU; the GPU is going to be pressed to its limit in the coming years even without any physics. I could imagine it all moving into one card, though.

Shaw, that sounds like a perfect job for you :)

LordIkon, the slogan from the DevDiv is "Changing the world one developer at a time". I don't think they are taking over the world, just changing it to look a certain way :)
Feb 6, 2008 at 8:51 PM
I think it'll be more like GPUs becoming general-purpose vector processors. Basically, an easier GPGPU. I wouldn't be surprised if something like CUDA found its way into the mainstream over the next few years. Hell, the G80s are practically 128-core processors. They're just extremely simple cores that can't do much more than floating-point math and simple branching. Of course the chip can also do SIMD across cores, maxing out at around 518 GFLOPS total [Nickolls, GH 2007]. By comparison, a quad-core Xeon 5300 will max out around 60 GFLOPS. What makes graphics processing so damn fast is how "embarrassingly parallel" it is. Physics processing is still highly parallel, but harder to do right than graphics. But it's still mainly vector math and would perform very well on vector hardware.
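To illustrate the "embarrassingly parallel" point, here's a minimal sketch in plain Python (names and numbers are illustrative, not from any real physics SDK): one Euler integration step where each particle reads and writes only its own state, so every loop iteration could in principle run on a separate GPU thread.

```python
# Sketch: per-particle Euler integration. Each iteration is independent
# of the others, which is exactly why this maps well onto GPU hardware.

def integrate(positions, velocities, gravity=-9.81, dt=1.0 / 60.0):
    """One Euler step over lists of (x, y, z) tuples; returns new lists."""
    new_pos, new_vel = [], []
    for (px, py, pz), (vx, vy, vz) in zip(positions, velocities):
        vy += gravity * dt  # gravity acts on this particle only
        new_vel.append((vx, vy, vz))
        new_pos.append((px + vx * dt, py + vy * dt, pz + vz * dt))
    return new_pos, new_vel

pos = [(0.0, 10.0, 0.0), (1.0, 5.0, 0.0)]
vel = [(0.0, 0.0, 0.0), (2.0, 0.0, 0.0)]
pos, vel = integrate(pos, vel)
```

The hard part of real physics (and what made it "harder to do right than graphics") is the coupling between particles, e.g. collision resolution, where iterations are no longer independent.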

Too bad the job posting is dated 2005 :)

Sure they're taking over the world. They just know how to do it. Train the current/next generation of developers on Microsoft proprietary tools and you'll command a greater market share in 10 years.
Feb 7, 2008 at 3:18 PM
Strum, I disagree. 3-way SLI has recently been released by NVIDIA; do I see an overall scheme? 3-way SLI was touted for a while as being 1 card for the physics and 2 for graphics. And can you honestly think of a game that can't be run well on two 8800 Ultras? (Extreme example, I know, but still relevant.) Plus, many games I play are CPU, not GPU, limited, even on my lowly 7600gt. And not only games: engineering sims (e.g. AutoCAD) could offload processing to the GPU when it's not otherwise needed. Plus PCI Express 2.0 should have plenty of bandwidth.
Coordinator
Feb 7, 2008 at 5:14 PM
Who is Strum? :oP
Feb 7, 2008 at 5:17 PM
Well, as we move forward we will definitely see the need for better HW than we have today: instead of 1024x1024 terrains you will have 10000x10000 ones with multiple runtime deformations, models will use 20M or even 200M polys, etc. The reason we do not see these today is mostly due to business requirements. Take WoW for one: they use a low-poly rendering engine with low-poly models. The reason is simply that they want to support as many platforms as possible, making a bigger community. Crysis on the other hand demands a lot more from the gfx but isn't really as hard on the CPU. And there are projects which push even NVIDIA SLI setups over the limit, just to be able to render better environments.

Feb 7, 2008 at 5:44 PM

Sturm wrote:
Crysis on the other hand requires a lot more from the gfx but isn't really as hard on the cpu.


Are you sure about that? Crysis is definitely GPU bound, but my quad core is pushed pretty hard by that game.
Feb 7, 2008 at 7:00 PM
Well, it depends on how you compare. On my old x700 I can only play that game on the low setting; going any higher on the gfx settings breaks the game (makes it unplayable). The same machine with a more powerful GPU allows me to go higher. But yes, there is a CPU cost as well, which means I have to upgrade sometime in the not too distant future.
Coordinator
Feb 9, 2008 at 3:50 AM
I'll have to agree that Crysis pushes my GPU more than my CPU, or so say my temp monitors. Of course, that is only for the first minute or two until it crashes. Can't even make it more than 2 minutes into the very first level.....so far a waste of $50. I'm also surprised my GPU fan is going full blast even at low settings like 1024x768 and a mix of medium and low settings.
Feb 9, 2008 at 5:59 AM
Crysis is GPU bound, yes. That wasn't my point. All I was trying to say is that Crysis also likes to use up CPU time.
Coordinator
Feb 10, 2008 at 3:15 AM
As good as the game looks, the gameplay bugs/issues are really starting to bother me. Twice during play my guy has literally just fallen over dead. Both times there were no enemies in range; in fact, one time I stepped into the VTOL because two ally NPCs kept bumping into each other repeatedly (AI bug), and then I just fell over dead.

I may have to revert back about an hour into my game, because I was attempting to shoot down a helicopter, and it decided to fly up about 2,000-3,000 ft into the sky and NEVER come back down. Normally that wouldn't be a big deal; however, the goal for this mission is to secure the harbor, and that means killing a helicopter that will never come back down. Pretty ridiculous stuff.
Coordinator
Feb 10, 2008 at 3:17 AM
Edited Feb 10, 2008 at 3:27 AM
Apparently this is a known bug; I'm glad EA has decided not to patch it up....looks like I get to revert back about 90 minutes to when the helicopter was in range to shoot.

http://www.youtube.com/watch?v=GocomZecDeE
Feb 10, 2008 at 4:16 AM
Interesting, I never encountered that. The game is pretty mediocre, but the engine technology is incredible. Especially the Sandbox tool. That tool just kicks all sorts of ass.

Don't get me started on EA. They ruin almost everything they touch. The newest Need for Speed is the biggest pile of crap racing game ever. Crytek would have been better off with a different publisher.