NVIDIA Responds to ATI PhysX Comments

PC Games Hardware has a response from NVIDIA to a recent statement by ATI that developers other than Epic use PhysX for physics "because they're paid to do it." NVIDIA's Nadeem Mohammed, Director of Product Management for PhysX, says: "we do not pay developers to select PhysX instead of other physics solution." He also states that PhysX is not proprietary, even though it obviously is, saying: "PhysX is a complete Physics solution which runs on all major platforms like PS3, XBOX360, Wii, PC with Intel or AMD CPU, and on the PC with GeForce cards; it even runs on iPhone. It's available for use by any developer for inclusion in games for any platform - all free of license fees. There's nothing restrictive or proprietary about that."
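For readers unfamiliar with the SDK side of this: games consume PhysX as an ordinary C++ library, and the same host code runs whether or not GPU acceleration is present. Below is a minimal sketch of the typical setup, based on the PhysX 2.x-era API; exact names and signatures vary between SDK releases, so treat this as illustrative rather than authoritative.

    #include "NxPhysics.h"  // PhysX 2.x-era umbrella header

    int main()
    {
        // Create the SDK object. The same call works with or without a
        // GeForce GPU or PPU present; absent one, simulation runs on the CPU.
        NxPhysicsSDK* sdk = NxCreatePhysicsSDK(NX_PHYSICS_SDK_VERSION);
        if (!sdk) return 1;

        // Describe and create a scene with ordinary gravity.
        NxSceneDesc sceneDesc;
        sceneDesc.gravity = NxVec3(0.0f, -9.81f, 0.0f);
        NxScene* scene = sdk->createScene(sceneDesc);

        // One simulation step; a game would do this every frame.
        scene->simulate(1.0f / 60.0f);
        scene->flushStream();
        scene->fetchResults(NX_RIGID_BODY_FINISHED, true);

        sdk->releaseScene(*scene);
        NxReleasePhysicsSDK(sdk);
        return 0;
    }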

52 Replies. 3 pages. Viewing page 1.

52. Re: NVIDIA Responds to ATI PhysX Comments Mar 16, 2010, 12:49 Verno
 
It's hard to believe people are fighting over Intel bringing out non-integrated graphics. Their integrated solutions haven't exactly been stellar performers for anything but the extreme low end. I doubt Nvidia and AMD have anything to worry about beyond pricing pressure from a potential third entrant in the market.
 
Playing: Infamous Second Son
Watching: Midsomer Murders, Dominion, The Knick
 
51. Re: NVIDIA Responds to ATI PhysX Comments Mar 14, 2010, 14:54 Ant
 
MTechnik wrote on Mar 14, 2010, 10:25:
^Drag0n^ wrote on Mar 14, 2010, 01:51:
Personally, I hope it does make it...We need more shakeups in the graphics side of PCs. Too few high-end players.

^D^
It was a sad day when Matrox dropped out and let it be a 2-horse race.
It will be sad if AMD drops ATI video cards. Yeah, I miss Matrox too since I used its G400 card.

I use ATI now because NVIDIA dropped fullscreen video overlay to TV output a few years ago, and its newer cards don't support it at all.
 
 
Ant @ The Ant Farm: http://antfarm.ma.cx and Ant's Quality Foraged Links: http://aqfl.net ...
 
50. Re: NVIDIA Responds to ATI PhysX Comments Mar 14, 2010, 10:25 MTechnik
 
^Drag0n^ wrote on Mar 14, 2010, 01:51:
Personally, I hope it does make it...We need more shakeups in the graphics side of PCs. Too few high-end players.

^D^
It was a sad day when Matrox dropped out and let it be a 2-horse race.
 
 
49. Re: NVIDIA Responds to ATI PhysX Comments Mar 14, 2010, 08:15 ASJD
 
wtf_man wrote on Mar 13, 2010, 11:20:
ATi came out with Tessellation way back with the DX8 8000 series... it was called Truform. Very few games supported it. We now have DX11 and it's the "new rage" for game developers... why? Because now it's a standard in the DX11 specs. Since ATi already had a tessellation engine... it helped them get DX11 card first to market.

DX11 isn't a rage for anyone yet, probably won't be for quite a while, and won't ever be unless the next generation of consoles is DX11-based (but not DX12+).

I still haven't seen anything overly spectacular with Hardware Accelerated Physics. Extra particles... whooptiedoo. The non-hardware accelerated physics from old games like the original Far Cry or Oblivion seem to be good enough.

The ripping / tearing cloth in Mirror's Edge was pretty hot. But again, the new standard of mainstream development isn't DirectX inclusion, it's being able to run it on consoles, period.
 
 
48. Re: NVIDIA Responds to ATI PhysX Comments Mar 14, 2010, 01:51 ^Drag0n^
 
Intel's last response says it all.

http://www.pcgameshardware.com/aid,701071/Larrabee-2-is-still-possible/News/

Last I heard, Mike Abrash was spearheading a lot of the Larrabee work. Given that, I doubt it's dead; but since the first silicon was rejected as a consumer product, I'd guess it won't be on desktops anytime soon.

Personally, I hope it does make it...We need more shakeups in the graphics side of PCs. Too few high-end players.

^D^
 
 
"Never start a fight, but always finish it."
 
47. Re: NVIDIA Responds to ATI PhysX Comments Mar 13, 2010, 21:46 The PC Warrior
 
More dumb shit selectively quoted from articles out of context, ignoring the many articles that claim the contrary.

All you have is vaguely worded PR quotes filtered through your own interpretation. Intel's own website has ZERO on Larrabee as a discrete graphics solution; it's shelved and buried. The only thing still up is Larrabee as a software platform. You can keep clinging to the belief that they will be unveiling something this year if that's what floats your boat.

When Intel announces a new discrete graphics solution, then we'll examine it. Until then nothing is on record with anyone that Larrabee is alive as anything but a software platform, no matter how much capslock you use. You can call me names all you want; all you have is some face-saving PR quotes after Intel blew a billion in R&D and had to call it a day with Larrabee. End of story, period.
 
 
46. Re: NVIDIA Responds to ATI PhysX Comments Mar 13, 2010, 20:20 Beamer
 
Oh good lord you're making my brain hurt.

Will all caps make you understand?
READ THE CNET QUOTE:
""Larrabee silicon and software development are behind where we hoped to be at this point in the project," Intel spokesman Nick Knupffer said Friday. "As a result, our first Larrabee product will not be launched as a standalone discrete graphics product," he said. "

FIRST PRODUCT. THEY HAVE SHELVED THE FIRST PRODUCT. THEY HAVE NOT KILLED THE TECHNOLOGY. THEY HAVE NOT ENDED THE PRODUCT LINE. I SWEAR TO GOD YOU MUST UNDERSTAND THIS AND ARE SIMPLY TROLLING. EVERY SINGLE LINK I POSTED SPELLS THIS OUT. AND, CONTRARY TO WHAT YOU'VE SAID, THAT IS THE FIRST ATTACK ON YOU.

HERE WAS FROM THE ANANDTECH LINK:
"As of today, the first Larrabee chip’s retail release has been canceled. "

NOTE THE WORD "FIRST" THERE, IMPLYING THAT PLANS FOR THE SECOND AND THIRD RETAIL RELEASES HAVE NOT YET BEEN CHANGED.
HERE IS MORE FROM ANANDTECH:
"Next, this brings us to the future of Larrabee. Larrabee Prime may be canceled, but the Larrabee project is not. As Intel puts it, Larrabee is a “complex multi-year project” and development will be continuing. Intel still wants a piece of the HPC/GPGPU pie (least NVIDIA and AMD get it all to themselves) and they still want in to the video card space given the collision between those markets. For Intel, their plans have just been delayed."

NOTE THE LAST LINE. THE PLANS HAVE NOT BEEN CHANGED, JUST DELAYED.

Are the caps working? I'll stick with them, just in case. Clearly rational typing has not worked.

MORE FROM ANANDTECH:
"For the immediate future, as we mentioned earlier Larrabee Prime is still going to be used by Intel for R&D purposes, as a software development platform. This is a very good use of the hardware (however troubled it may be) as it allows Intel to bootstrap the software side of Larrabee so that developers can get started programming for real hardware while Intel works on the next iteration of Larrabee."

HE CLAIMS IT'S A GOOD THING THE CURRENT TECH WAS SHELVED TO R&D, AS THIS WILL GIVE DEVELOPERS A CHANCE TO LEARN TO USE IT BEFORE IT LAUNCHES TO CONSUMERS.

LAST BIT FROM ANANDTECH:
"For that matter, Since the Larrabee project was not killed, it’s a safe assumption that any future Larrabee chips are going to be based on the same architectural design. The vibe from Intel is that the problem is Larrabee Prime and not the Larrabee architecture itself. The idea of an x86 many-cores GPU is still alive and well."

"SINCE THE PROJECT WAS NOT KILLED" and "THE IDEA OF AN x86 MANY-CORES GPU IS STILL ALIVE AND WELL."

Still not believing it? Then there's no hope for you. You're usually not a dense poster. I don't understand why it's so hard for you to understand this.


I'll break it down, then: Intel was not getting the results they wanted from Larrabee. The project was delayed to the point that it was no longer significantly better than standard GPUs on the market, yet would cost more. Furthermore, yields were nowhere near what they needed to be. Rather than push out a half-assed product when the tech wasn't quite ready, Intel put the first-gen on the shelf. No need to release it. They have huge pockets but desperately want a chunk of the GPU market. No point in releasing something that wasn't perfect - you only get one first impression.
Gee, sounds like Fermi, doesn't it? Delayed to the point that, though competitive (and likely the fastest single-board solution), its dollars/performance ratio isn't quite right. The fabrication process also isn't quite right. Yet, unlike Intel, Nvidia is pushing Fermi through. Intel, wiser and with deeper pockets, is waiting a generation (or two) before bringing this to consumers.

 
-------------
Music for the discerning:
http://www.deathwishinc.com
http://www.hydrahead.com
http://www.painkillerrecords.com
 
45. Re: NVIDIA Responds to ATI PhysX Comments Mar 13, 2010, 19:39 The PC Warrior
 
Beamer wrote on Mar 13, 2010, 17:09:

Furthermore, find me one quote from Intel that says it's fully cancelled.

From your own links:

Despite the fact working samples had already been shown, that Intel executives had continually promised it would change everything, and that it was planned for release in 2010, the Intel board of directors has decided to shelve Larrabee, its consumer graphics chip. Instead Larrabee will now be made available as a software development platform: Larrabee New Instructions (LRBni).

And nothing about launching a standalone discrete graphics card in the future. Nothing. No dates, no hardware, no results. Show me the money, Beamer. You talk a good game, but I don't see text saying "we will be launching a Larrabee discrete graphics card in 20xx". Unless you can post that, don't post any more worthless filler, because it's annoying to have to read a bunch of stuff just to see you lied again.

See, there's the hate. All your posts about Nvidia have been angry. "Buying off developers." Again, you're a fool if you don't think ATI/AMD is paying developers to optimize for their hardware. You're a fool if you don't think Intel does it. You're a fool if you don't think this kind of thing happens in every industry.

All you've done is repeat yourself and attack me. Funny, I don't see "The way ATI meant it to be played" in game intros, I don't see exclusive AMD graphics card features in games, nor do I see ATI inking agreements to disable competitor functionality and calling them "marketing deals". Address those points instead of trying to play internet psychologist. The only anger here is yours; you keep lashing out instead of producing anything.

I'd hate for anyone to think you actually had a clue what you were talking about.
 
 
44. Re: NVIDIA Responds to ATI PhysX Comments Mar 13, 2010, 18:00 ^Drag0n^
 
ramerco wrote on Mar 12, 2010, 23:46:
GPU hardware accelerated physx is proprietary. The physx sdk and drivers can run on systems without hardware acceleration, just not as well.

Yeah, wouldn't it be great if that were true.

Unfortunately, some dickwad at nV thought it would be cool to disable hardware accelerated physx on NV hardware if you have an ATI IGP or GPU also present. Including the standalone physx PPU cards.

Not only is it draconian, it's asinine: it actually hurts nV, since people who own an ATI IGP or GPU and might want a dedicated nV PPU/GPU for physics can't use one anymore.

Brilliant marketing. Right up there with some of the recent DRM runs.

I've been running green pretty much 100% since 1998, but this BS, along with the obvious technical problems they have been having this year getting their latest designs out (due later this month), has made me decide to go red on my next GPU.

And yes, I know there's a "hack" that mods the Physx drivers to eliminate the "feature," but I don't think it should ever come to that. Nvidia shouldn't be retarded enough to screw people who legitimately want to use their accelerated physx processor together with someone else's GPU out of the box.

^D^

PS: Let me be clear: I couldn't care less if Physx runs on an ATI card; NV owns it, and more power to them. The issue here is NV has killed the hardware acceleration on legitimate hardware if ATI GPU acceleration is present.

I sure as hell won't cry a tear, though, when MS finishes their DirectX physics API and puts an end to this BS.

This comment was edited on Mar 13, 2010, 18:21.
 
 
"Never start a fight, but always finish it."
 
43. Re: NVIDIA Responds to ATI PhysX Comments Mar 13, 2010, 17:40 I've Got The News Blues
 
Beamer wrote on Mar 13, 2010, 17:09:
I'd hate for someone to read your posts and think Nvidia is doing something unethical and uncommon
Whether something is uncommon or not has no bearing on whether it's good for the consumer. Embracing open standards is good for the consumer. Then companies have to compete for the consumer's business on the real merits of their products rather than by creating artificial limitations like what Nvidia is doing with PhysX. While you and I may disagree on what is ethical, I believe that Nvidia is definitely being unethical in disabling the PhysX capabilities of its own video cards when its competitors' video cards are used in the same PC. That's blatantly anti-competitive.
 
 
42. Re: NVIDIA Responds to ATI PhysX Comments Mar 13, 2010, 17:09 Beamer
 
Intel says Larrabee is dead as a consumer product, officially on the record.

For the love of god did you click the links I pasted?
Particularly the one from the engineer working on the project? He spells out in no uncertain terms that Larrabee is NOT dead as a consumer project, only the first generation was.
Furthermore, find me one quote from Intel that says it's fully cancelled. Everything came from a phone conversation. Here's a follow-up quote:
"Larrabee silicon and software development are behind where we hoped to be at this point in the project. As a result, our first Larrabee product will not be launched as a standalone discrete graphics product.
Note that he says "Our first Larrabee project will not be launched as a standalone discrete graphics product." He does not say the Larrabee consumer project is dead.
http://www.bit-tech.net/news/hardware/2009/12/07/intel-larrabee-cancelled/
http://news.cnet.com/8301-13924_3-10409715-64.html
Wow. Seriously. Just about every site picked up the correction, but you're unable to find it.

who said i hate nvidia? nice strawman there. maybe i like nvidia actually producing worthwhile consumer products instead of pushing useless, proprietary standards and buying off developers.

See, there's the hate. All your posts about Nvidia have been angry. "Buying off developers." Again, you're a fool if you don't think ATI/AMD is paying developers to optimize for their hardware. You're a fool if you don't think Intel does it. You're a fool if you don't think this kind of thing happens in every industry.
I have no real love for Nvidia, but I don't hate them for not being competitive this generation and really hope they bounce back and keep the game between them and ATI competitive.
It's no strawman - you're spewing tons of hate here.


Actually, hate and FUD. I'd hate for someone to read your posts and think Nvidia is doing something unethical and uncommon, or think that Intel's Larrabee is dead.
 
-------------
Music for the discerning:
http://www.deathwishinc.com
http://www.hydrahead.com
http://www.painkillerrecords.com
 
41. Re: NVIDIA Responds to ATI PhysX Comments Mar 13, 2010, 16:25 The PC Warrior
 
Beamer wrote on Mar 13, 2010, 02:01:
No, Larrabee G1 is dead. How can you not find any information on this? It's in the Wikipedia article, even. Here's a quote from Anandtech:

Intel says Larrabee is dead as a consumer product, officially on the record. I can find about 15 sources on Google to contradict what you said; just google "larrabee cancelled" or "larrabee dead on arrival" if you don't believe me.

Trust me, I've spoken with Intel. They acknowledge that their PR did a terrible job with this one. The choice of words was terrible.

hahahaha yeah ok internet guy, we will all just trust you. maybe next time you're talking to intel, you can get them to put that in writing somewhere for you.

Not sure why you seem to hate Nvidia so much. Never understood choosing a side in this battle. Go with whoever is giving us the power. We should all hope Fermi works out because it has the potential to give us much more power than what AMD is pushing. Beyond that, if Fermi fails then Nvidia will probably go down with it and we'll be left with a one-player market. You really want ATI to stand alone?

who said i hate nvidia? nice strawman there. maybe i like nvidia actually producing worthwhile consumer products instead of pushing useless, proprietary standards and buying off developers.
 
 
40. Re: NVIDIA Responds to ATI PhysX Comments Mar 13, 2010, 15:43 I've Got The News Blues
 
Agent.X7 wrote on Mar 13, 2010, 14:39:
You guys are aware that PhysX has a software-based version that runs on consoles and PCs without acceleration, right? (Just like Havok)
Yes, and we're also aware that CPU-based PhysX runs like shit. Apparently you are not aware of that. If you had ever tried to run the PhysX pack for UT3, Warmonger, or CellFactor on a PC with an AMD/ATI video card, you would know that.

Everyone seems to have their panties in a wad about accelerated PhysX, which NVidia does on its video cards on PCs. Oh no, an added benefit that NVidia card owners get, boo hoo.
The reason people are pissed about this is that there is no technical reason why PhysX can't run on an AMD/ATI graphics card. None! Nvidia is simply creating an artificial monopoly on GPU-accelerated PhysX by refusing to port it to an open standard like OpenCL. The fact that Nvidia won't even allow its video cards to be used for PhysX in a PC which has an AMD/ATI video card demonstrates its real monopolistic intentions.
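To make the "no technical reason" argument concrete: GPU physics is plain data-parallel arithmetic, and OpenCL expresses it in a vendor-neutral way. The kernel below is a hypothetical sketch in OpenCL C (it is not PhysX code); any OpenCL-capable GPU, ATI or NVIDIA, could compile and run it unmodified.

    // OpenCL C: one Euler integration step over a particle system.
    // Nothing here is vendor-specific; the driver compiles this same
    // source for whatever OpenCL device is present at runtime.
    __kernel void integrate(__global float4* pos,
                            __global float4* vel,
                            const float dt)
    {
        int i = get_global_id(0);                       // one work-item per particle
        float4 g = (float4)(0.0f, -9.81f, 0.0f, 0.0f);  // gravity
        vel[i] += g * dt;                               // accumulate acceleration
        pos[i] += vel[i] * dt;                          // advance position
    }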
 
 
39. Re: NVIDIA Responds to ATI PhysX Comments Mar 13, 2010, 14:54 Beamer
 
nVidia knows they're not going to win on that this generation, so they've been pushing their proprietary, closed features as an alternative. Hence we see nVidia trying to dominate the conversation with talk of 3DVision and PhysX.

Unfair to say.
Nvidia has been pushing PhysX since long before they were wiped out this generation. In fact, they were pushing it when they were the market leaders.
As for 3DVision and/or Eyefinity, c'mon, we've had buzzwords like this coming from GPU makers since the S3 ViRGE days. Nothing new. Sometimes they pan out and we get Crossfire. Sometimes they don't.
 
-------------
Music for the discerning:
http://www.deathwishinc.com
http://www.hydrahead.com
http://www.painkillerrecords.com
 
38. Re: NVIDIA Responds to ATI PhysX Comments Mar 13, 2010, 14:52 Beamer
 
Everyone seems to have their panties in a wad about accelerated PhysX, which NVidia does on its video cards on PCs. Oh no, an added benefit that NVidia card owners get, boo hoo. Don't forget that ATi/AMD tried the same shit, and we'd have accelerated Havok today if they hadn't fucked it up. Now they claim to be the good guys because they failed to get people to buy into accelerated Havok and are backing open-source accelerated physics instead.

Bing bing bing. And, as I said earlier, any of you that have played an Unreal Engine game have used PhysX. Same with Unity.
It was Intel that tried to hardware accelerate Havok, though, with Havok FX. They bailed on it, deciding it wasn't worthwhile. Larrabee is part of that reason - they figured a GPGPU that can handle physics is better than some other solution.
I agree.
 
-------------
Music for the discerning:
http://www.deathwishinc.com
http://www.hydrahead.com
http://www.painkillerrecords.com
 
37. Re: NVIDIA Responds to ATI PhysX Comments Mar 13, 2010, 14:39 Agent.X7
 
TheDevilYouKnow wrote on Mar 12, 2010, 22:25:
I agree with Creston. The fact that it runs on NVIDIA hardware only by definition makes it proprietary.

You guys are aware that PhysX has a software-based version that runs on consoles and PCs without acceleration, right? (Just like Havok)

Check some console games. Just on my shelf I have Uncharted 2, which uses Havok, and Terminator Salvation, which uses PhysX. It's just the base API for doing physics in a game. It's a choice between software packages, just like OpenGL or DirectX.
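As a hypothetical sketch of what "a choice between software" means in practice: engines typically code against a thin internal interface and bind whichever physics middleware the project licensed. The names below are illustrative, not from any real engine or SDK.

    #include <vector>

    struct RigidBody { float pos[3]; float vel[3]; };

    // The game talks to this interface and never needs to know whether
    // PhysX, Havok, or something homegrown sits underneath.
    class IPhysicsBackend {
    public:
        virtual ~IPhysicsBackend() {}
        virtual void step(float dt) = 0;  // advance the whole simulation
    };

    // One implementation per middleware choice (a PhysXBackend, a
    // HavokBackend, ...). Shown here: a trivial CPU-only stand-in.
    class SimpleCpuBackend : public IPhysicsBackend {
        std::vector<RigidBody> bodies_;
    public:
        void step(float dt) override {
            for (RigidBody& b : bodies_) {
                b.vel[1] -= 9.81f * dt;            // gravity
                for (int i = 0; i < 3; ++i)
                    b.pos[i] += b.vel[i] * dt;     // integrate position
            }
        }
    };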

Everyone seems to have their panties in a wad about accelerated PhysX, which NVidia does on its video cards on PCs. Oh no, an added benefit that NVidia card owners get, boo hoo. Don't forget that ATi/AMD tried the same shit, and we'd have accelerated Havok today if they hadn't fucked it up. Now they claim to be the good guys because they failed to get people to buy into accelerated Havok and are backing open-source accelerated physics instead.

This comment was edited on Mar 13, 2010, 14:47.
 
 
Origin - JStarX7
STEAM - Agent.X7
PSN - JStar_X7
Xbox Live - Agent X7
 
36. Re: NVIDIA Responds to ATI PhysX Comments Mar 13, 2010, 13:43 ForgedReality
 
Eldaron Imotholin wrote on Mar 13, 2010, 12:35:
I have not yet seen anything PhysX can do that Havok or whatever can't. Seriously, I've even watched PhysX promo trailers showing the game splitscreen with and without PhysX. It was like playing "Find It" on hardcore.

Fuck PhysX. No one would really miss it if it simply stopped existing.
Have never seen a Havok game.

Or if it used Havok, never noticed.

PhysX is noticeable, and it was really well implemented in Batman:AA.

Dunno where you're seeing PhysX-level Havok effects, but I'd be interested to find out.

finga wrote on Mar 13, 2010, 12:21:
Seems pretty simple right now. ATI is delivering better raw performance for the money at several price points, and nVidia knows they're not going to win on that this generation, so they've been pushing their proprietary, closed features as an alternative. Hence we see nVidia trying to dominate the conversation with talk of 3DVision and PhysX.

For me, Eyefinity is just as interesting of a feature as 3DVision in-game, but 1) 3DVision can cause headaches and offers no gameplay advantage, and 2) Eyefinity setups can be used outside of games. On the physics front, I see so many games out there only using 25-50% of a quad core CPU. Why would I use the component that's most often the bottleneck in games (the GPU) for that calculation when the CPU could be used? Unfortunately, developers still haven't figured that bit out yet and are adding in things like PhysX support only for nVidia cards when that could have easily gone on the CPU.

Show me a CPU with hundreds of cores, and we'll be in business.
 
 
35. Re: NVIDIA Responds to ATI PhysX Comments Mar 13, 2010, 12:37 Joss
 
Beamer wrote on Mar 13, 2010, 02:01:
Never understood choosing a side in this battle. Go with whoever is giving us the power.

And I blame that on football. Yeah, we should be thankful we have a choice here. I'll ride whatever wave comes to us.
 
 
34. Re: NVIDIA Responds to ATI PhysX Comments Mar 13, 2010, 12:35 Eldaron Imotholin
 
I have not yet seen anything PhysX can do that Havok or whatever can't. Seriously, I've even watched PhysX promo trailers showing the game splitscreen with and without PhysX. It was like playing "Find It" on hardcore.

Fuck PhysX. No one would really miss it if it simply stopped existing.
 
 
Playing: Skyrim, World of Warcraft.
Future: Dead Space 3.
 
33. Re: NVIDIA Responds to ATI PhysX Comments Mar 13, 2010, 12:21 finga
 
Seems pretty simple right now. ATI is delivering better raw performance for the money at several price points, and nVidia knows they're not going to win on that this generation, so they've been pushing their proprietary, closed features as an alternative. Hence we see nVidia trying to dominate the conversation with talk of 3DVision and PhysX.

For me, Eyefinity is just as interesting of a feature as 3DVision in-game, but 1) 3DVision can cause headaches and offers no gameplay advantage, and 2) Eyefinity setups can be used outside of games. On the physics front, I see so many games out there only using 25-50% of a quad core CPU. Why would I use the component that's most often the bottleneck in games (the GPU) for that calculation when the CPU could be used? Unfortunately, developers still haven't figured that bit out yet and are adding in things like PhysX support only for nVidia cards when that could have easily gone on the CPU.
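The point about idle cores is easy to sketch: particle-style physics splits into independent slices, one per hardware thread, with no locking needed. A minimal illustration, assuming C++11 threads; this is not code from any shipping engine.

    #include <thread>
    #include <vector>

    struct Particle { float pos[3]; float vel[3]; };

    // Integrate one slice of the particle array. Slices don't overlap,
    // so each worker thread can run without synchronization.
    static void integrateSlice(Particle* p, std::size_t count, float dt) {
        for (std::size_t i = 0; i < count; ++i) {
            p[i].vel[1] -= 9.81f * dt;             // gravity
            for (int k = 0; k < 3; ++k)
                p[i].pos[k] += p[i].vel[k] * dt;   // advance position
        }
    }

    void stepParticles(std::vector<Particle>& particles, float dt) {
        unsigned n = std::thread::hardware_concurrency();  // e.g. 4 on a quad core
        if (n == 0) n = 1;
        std::size_t chunk = particles.size() / n;
        std::vector<std::thread> workers;
        for (unsigned t = 0; t < n; ++t) {
            std::size_t begin = t * chunk;
            std::size_t count = (t == n - 1) ? particles.size() - begin : chunk;
            workers.emplace_back(integrateSlice, particles.data() + begin, count, dt);
        }
        for (std::thread& w : workers) w.join();   // physics done for this frame
    }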
 
 