Crysis DirectX 9 & 10 MP Performance

Total Crysis offers an update on how multiplayer will work in Crysis (thanks Voodoo Extreme). A number of topics are covered, including how DirectX 9 servers will be more limited than DirectX 10 servers, not because of software, but because of hardware:
DX9 vs. DX10 – The endless question
To shed some light on one of the most discussed topics regarding Crysis multiplayer, I would like to explain the differences between Crysis MP in DX9 and DX10.

As for the DX9 version, we won't have physics or the day-and-night cycle in-game. That means you won't be able to shoot down trees or alter any objects other than vehicles on the map. Additionally, the time-of-day setting doesn't change dynamically. This is due to the tremendous server load such physics might cause on crowded game servers. Still, you will be able to experience maps with different time-of-day settings, since the maps can be altered in the Sandbox2 Editor.

Rather than provide the community with partially working features, we limit them to the DX10 version. With the stronger hardware available for DX10, server load is lower and performance is higher. This ensures the pure physics and day-and-night-cycle experience without any limitation.

Gamers with a DX10 card are able to play on DX9 servers, but with the limitations of the respective server. The reverse is not possible: gamers with DX9 cards cannot play on DX10 servers, due to their cards' limited features.
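The asymmetric join rule described above reduces to a one-way capability check; here is a minimal sketch in Python, with all names and the numeric levels invented for illustration:

```python
# Hypothetical sketch of the asymmetric join rule: a client may only
# join servers at or below its own feature level.
DX9, DX10 = 9, 10

def can_join(client_level: int, server_level: int) -> bool:
    """A client can join a server at or below its own feature level."""
    return client_level >= server_level

assert can_join(DX10, DX9)      # DX10 card on a DX9 server: allowed
assert not can_join(DX9, DX10)  # DX9 card on a DX10 server: refused
```

The point of the asymmetry is that a DX10 client can always emulate the smaller DX9 feature set, while a DX9 client has no way to represent DX10-only world state.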
107 Replies. 6 pages. Viewing page 3.
Newer [  1  2  3  4  5  6  ] Older
67.
 
Re: No subject
Sep 14, 2007, 11:17
 
And yet his opinion is anything above 3GB is a "waste".
Wrong.
Read the link; there is a subtle difference between your quad-core waste and my 4GB waste.

This comment was edited on Sep 14, 11:31.
66.
 
Re: No subject
Sep 14, 2007, 11:06
 
@ >u
I wouldn't call the following official description from the news post above "great" physics. It sounds like pretty damn gimped or non-existent physics to me.

As for the DX9 version we won’t have physics and day and night cycle in-game. That means you won’t be able to shoot down trees and/or alter any other objects than vehicles on the map...Rather than providing the community partially working features we limit this for the DX10 version only


I explained the scenario, but yes, the wording is bad, as in "won't have physics". Do you seriously believe there will be no physics? I mean, they can't even do what Far Cry did, and every other game for the last several years??? Come on, I have no doubt they are talking about the big physics explosions, the things Ageia always tries to show off to make itself look special. In this case: falling trees, branches, blowing up huts, leaves moving as you walk through, etc.
Do you think driving the jeep won't have physics attached, etc.?

65.
 
Re: No subject
Sep 14, 2007, 11:01
 
Unless you are going to be using 64bit xp or vista, 4 GB of RAM is worthless.

And yet his opinion is anything above 3GB is a "waste".
Wrong.
It's no more a waste than a quad core, whose cores you will never fully utilize.

So what if you have 512MB unused? Maybe you want to upgrade to Vista 64 in the next year or two. Or maybe you want to use /3GB for certain games that are large-address-aware, leaving more RAM available to the game instead of paging. Besides, x64 deals with the same reserved-address-space issues and can't fully utilize 4GB of RAM either, so you could say 4GB on x64 is "wasting" RAM too.
Combining 2x1GB and 2x512MB is also silly and has potential pitfalls. 4x1GB or 2x2GB is not the end of the world and will even have advantages over a 3GB limit in certain current games already.
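The 32-bit argument in this thread is just address-space arithmetic: device mappings live below 4GB, so they displace RAM. A quick sketch, with the reservation size invented for illustration (real figures vary by chipset and video card):

```python
# Illustrative 32-bit address-space arithmetic. Devices (video RAM,
# MMIO ranges, etc.) are mapped below the 4GB line, so a 32-bit OS
# cannot address all of the installed 4GB of RAM. The 768MB
# reservation here is an assumed example, not a measured value.
GB = 1024**3
address_space = 4 * GB             # entire 32-bit physical address space
device_reserved = 768 * 1024**2    # assumed video RAM + MMIO below 4GB

usable_ram = address_space - device_reserved
print(usable_ram / GB)             # 3.25
```

This is why reported "usable" RAM on 32-bit XP/Vista typically lands somewhere around 3 to 3.5GB rather than the full 4GB.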


This comment was edited on Sep 14, 11:08.
64.
 
Re: No subject
Sep 14, 2007, 10:59
Enahs
 
 
I wouldn't call the following official description from the news post above "great" physics. It sounds like pretty damn gimped or non-existent physics to me.

That is for MP only.

Alternating Logo (GreaseMonkey script):
http://www.ualr.edu/szsullivan/scripts_/BluesNewslogo.user.js
I am free of all prejudice. I hate everyone equally.
- W. C. Fields
63.
 
No subject
Sep 14, 2007, 10:55
 
Just to jump in here real quick.

I have to respectfully disagree with you on the 24" monitor statement, theyarecomingforyou.

"For a single 8800GTX get a 22" widescreen"

I've been running my 24" Dell 2405FPW for several years now... and I wouldn't trade it for anything! (OK, except maybe a 2407FPW.) I'm only running a 7950GX2, and I run my games fine.
In every situation I can run a game the way I want, at 1920x1200 with a little AA or at 1680x1050 with decent AA, quite well. My roommate has a 2405FPW on a 7800GTX, and he has no problems either.

The size and resolution come in handy for me so often that I just can't see working/gaming without it.
It also has the convenience factor of having tons of inputs, including tv inputs, which come in handy when working on tvs, pcs, xboxs, etc. When my buddies come over for a quick xbox lan, my 24" monitor instantly becomes a damn handy extra tv.

maximus0402, make whatever decision you want, but just know that an 8800GTX will push a 24" monitor JUST FINE. However, your hope of "be(ing) able to run on max settings for now and future games in the next year or two" is just unfounded and won't happen. Even if you buy a 17" LCD with a max res of 1280x1024, you won't run every game for the next two years at MAX settings. You just WON'T. Software will always push the hardware envelope in a way that keeps the hardware humble.
With that said, you should have no problem running your games VERY NEARLY all the way up. There's always a happy medium when it comes to graphics settings; a give here, take there kind of thing. Something you have to remember (or clarify) is what "MAX settings" really means, because it means very different things to different people. And with games adding more technologies all the time, there's more and more to consider when "maxing out" your graphics settings. Games nowadays have shader, physics, AA, texturing, shadowing, fog, draw-distance, diffuse, bump, specular, AF (etc.) settings, and keeping them all maxed out will always be a chore. But finding a mix you're comfortable with should be very easy to do with an 8800GTX (at 1920x1200) for several years.
Still, I would say opting out of a 24" monitor simply because you can't blast every game out of the park at 1920x1200 for the next two years is pretty foolish, in my opinion, because that's simply not going to happen anyway, regardless of resolution.

Something else to consider: if you are that concerned with getting the most insane performance possible, wait a few more months for Nvidia to unveil their new GPU line, which should hit sometime this fall, or maybe winter. As far as SLI goes, I PERSONALLY don't think it's worth it. It certainly has its performance advantages, but I think those advantages come too little, too late, too much of the time, because SLI is so very driver/software dependent. This is normally fine when your cards are new, since your equipment is the flagship product and keeping the support up is paramount, but once your card is a generation old, support for you goes on the back burner behind the new king, and you often won't find proper SLI support for the newest game you just bought for weeks or months. And in the end, unless you're trying to push insane graphics at insane resolutions (2560x1600), the gain you might eventually get only really applies in that ultra-high-end space. Personally, I don't think being in that space is worth the cost of entry, plus the ongoing cost of searching for SLI profiles/settings that work best for the newest game that 'doesn't have native support yet'. Again, this is all just what I've seen from my experience. Your mileage (and tolerance) may differ.

My X2-4800+ and 7950GX2 still push my 24" quite well... better than I could have ever hoped for, considering how long ago I bought them. I don't think you could possibly be disappointed in your purchase either.

62.
 
Re: No subject
Sep 14, 2007, 10:50
 
Stalker has a day/night cycle, but only in single-player (I don't think it's in multiplayer).

If you take advantage of DX10 like this, why not take advantage of multi-core processors?

61.
 
Re: No subject
Sep 14, 2007, 10:49
>U
 
DX9 will have great physics, so let's not get crazy now.

I wouldn't call the following official description from the news post above "great" physics. It sounds like pretty damn gimped or non-existent physics to me.

As for the DX9 version we won’t have physics and day and night cycle in-game. That means you won’t be able to shoot down trees and/or alter any other objects than vehicles on the map...Rather than providing the community partially working features we limit this for the DX10 version only


60.
 
Re: No subject
Sep 14, 2007, 10:48
 
It will have quad core, it will have 3 to 4 gig of ram, it will have vista and xp
Unless you are going to be using 64-bit XP or Vista, 4GB of RAM is worthless. By the way, I'm not just going to make this assertion without backing it up with some evidence; the best explanation I found of why 4GB is a waste on 32-bit Vista or XP is here:
http://www.dansdata.com/askdan00015.htm


59.
 
Re: ...
Sep 14, 2007, 10:47
 
Do what F.E.A.R. and some other games do and build a performance test into the game. Multi-core CPU support is certainly detectable, and the game can run a benchmark level to determine if the PC can handle the extra effects.

As was pointed out below, the mere presence of DirectX 10 does NOT mean that the PC can actually handle the advanced effects. I seriously doubt that a PC with a low-end DirectX 10 compatible video card such as an ATI 2400XT or Nvidia 8400 is going to run the game better than someone with a top-of-the-line DirectX 9 card like an Nvidia 7950 or ATI X1950, especially when it comes to physics.


I understand that, but as simple as you make it sound, there are many other pitfalls to doing a pure HW separation. An API is a far easier line of demarcation from a programming-challenge perspective.

Yes, I know DX10 will have some weaker HW; that is obvious. But it will have the true API benefits available, if they are leveraged, and it will have a geometry shader. Things like the DX9 batch issue could cause problems with all those objects.
Like I tried to explain, there are many reasons a DX10 cutoff makes sense from a development-complexity standpoint.

Unfortunately, no other game has really shown what DX10 can do that DX9 cannot; maybe this is the game.

This comment was edited on Sep 14, 10:49.
58.
 
Re: No subject
Sep 14, 2007, 10:42
 
When it comes to physics, well, HL2 did an OK job without DX10,
& so have many other titles in the past. So the hype about quad core for physics was BS? Now we need DX10 to have the physics effects.


You will still have physics in DX9. You will probably have great physics in DX9.

You don't need DX10 for physics effects; you need DX10 hardware to render the huge number of objects in their enhanced explosions. DX10 has a geometry shader that can render procedural geometry, and it also allows particle systems to be completely independent of the CPU. Furthermore, the DX9 batch issue limits the number of objects rendered, where DX10 can do many thousands more. Etc., etc.
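The "batch issue" here is per-draw-call CPU overhead: each submitted object costs the CPU a roughly fixed amount of time. A toy model in Python (the per-call costs are invented for illustration, not measured DX9/DX10 numbers) shows why thousands of debris objects blow the frame budget on the expensive path:

```python
# Toy model of per-draw-call (batch) CPU cost. The microsecond costs
# are assumed values for illustration only.
def frame_cpu_ms(num_objects: int, us_per_draw_call: float) -> float:
    """CPU time (ms) spent just submitting draw calls for one frame."""
    return num_objects * us_per_draw_call / 1000.0

debris = 5000  # objects flying around after a big physics explosion
# An expensive submission path alone exceeds a 16ms (60fps) budget...
assert frame_cpu_ms(debris, us_per_draw_call=40.0) > 16.0
# ...while a cheaper path (or instancing) leaves plenty of headroom.
assert frame_cpu_ms(debris, us_per_draw_call=2.0) < 16.0
```

The model is crude, but it captures why object count, not shader quality, is the wall the poster is describing.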

So they could actually be the first developer to really leverage the DX10 API and do some things that cannot be done in DX9 (unlike all the other current so-called DX10 games). That is horrible, really horrible.

DX9 will have great physics, so let's not get crazy now.

57.
 
Re: ...
Sep 14, 2007, 10:41
>U
 
Can you just detect a fast enough CPU and GPU on either API and switch on the extra effects?
Do what F.E.A.R. and some other games do and build a performance test into the game. Multi-core CPU support is certainly detectable, and the game can run a benchmark level to determine if the PC can handle the extra effects.

As was pointed out below, the mere presence of DirectX 10 does NOT mean that the PC can actually handle the advanced effects. I seriously doubt that a PC with a low-end DirectX 10 compatible video card such as an ATI 2400XT or Nvidia 8400 is going to run the game better than someone with a top-of-the-line DirectX 9 card like an Nvidia 7950 or ATI X1950, especially when it comes to physics.
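The F.E.A.R.-style approach suggested here can be sketched in a few lines: detect cores, time a representative workload, and gate the extra effects on the measurement rather than on the API version. All names and thresholds below are invented for illustration:

```python
import os
import time

def timed_benchmark(workload, budget_ms: float) -> bool:
    """Run a representative workload and check it fits the time budget."""
    start = time.perf_counter()
    workload()
    return (time.perf_counter() - start) * 1000.0 <= budget_ms

def pick_settings(budget_ms: float = 500.0) -> dict:
    cores = os.cpu_count() or 1  # multi-core support is detectable
    fast_enough = timed_benchmark(
        lambda: sum(i * i for i in range(200_000)),  # stand-in workload
        budget_ms)
    # Extra effects are gated on measured speed, not on the mere
    # presence of DX10-class hardware.
    return {"extra_physics": cores >= 2 and fast_enough}
```

A real game would time a scripted benchmark level instead of an arithmetic loop, but the decision logic is the same.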

This comment was edited on Sep 14, 10:45.
56.
 
Re: No subject
Sep 14, 2007, 10:39
 
the only people i hear bitching about vista either a). haven't tried it or b), tried it with 5 year old hardware.
a) wrong, and
b) wrong.

I have to use it sometimes for work, on top-of-the-line systems. And it sucks. It's a horribly bloated, DRM-locked, clunky, no-real-reason-to-upgrade-aside-from-MS-wanting-a-new-profit-point heap, with a few nice improvements over XP and a whole slew of bad changes that shouldn't have happened.

If games are truly going in this direction, then this is where I get off the ship. I'll play a very limited selection (which I do anyway) offered on the systems I choose to support, and I'll wait it out on the other side of the fence for the day when everyone has had just about enough of this crap and a Firefox-like mass migration to another OS (be it Linux, Mac OS, React, etc.) takes place for both users and developers. As Linux is already part of the Epic and id scene, and soon to be on Valve's plate as well, it shouldn't be long before the little guys follow suit and the option to migrate is real. And then bye-bye to these Microsoft shackles once and for all.

Edit: Almost forgot - Fuck EA!


This comment was edited on Sep 14, 10:41.
55.
 
Re: No subject
Sep 14, 2007, 10:30
 
Exactly Sempai!

People just have to read the articles about the STALKER expansion, Clear Sky or whatever it's called...

According to what I've read it looks just as impressive as any DX10 title. I did not see the comparison first hand, but I tend to trust the source of this particular article.

When it comes to physics, well, HL2 did an OK job without DX10,
& so have many other titles in the past. So the hype about quad core for physics was BS? Now we need DX10 to have the physics effects.

They should have gone with PhysX... I would rather pay $150 for the card, based on what I saw with GRAW2 (PhysX island reviews), than pay $230 for the version of Vista I would be interested in... but that's just me; most people would say both the PhysX card & Vista are a waste of $$$.

54.
 
Re: No subject
Sep 14, 2007, 10:30
>U
 
When you bought xp also happens to be irrelevant.

Of course it's relevant. If my OS is nine months old, why should I have to replace it to run a few games that could deliver the same capabilities on XP if written using OpenGL or other techniques?

Money does make the world go round, though, and it's not unreasonable to expect consumers to have to upgrade every six or so years.
In five years it might be time to move on. Otherwise, expecting people with one-year-old PCs to buy a new OS to play games is unreasonable. Right now nothing else requires Vista, not even Microsoft's own applications.


53.
 
Re: ...
Sep 14, 2007, 10:30
 
Well, are DirectX 9 and 10 in the same box, or are there two different SKUs for XP and Vista?

Honestly, for me the big thing with Crysis is single-player. Far Cry MP was OK, but nothing overly new except pretty scenery.

I guess my question is: why do day & night cycles in MP need DX10?

52.
 
Re: ...
Sep 14, 2007, 09:57
 
harcore, I imagine in the half a million interviews and discussions after the game comes out, there will be many. However, asking for it now is, well, like asking any other question about Crysis: not likely to get answered any time soon.

51.
 
Re: ...
Sep 14, 2007, 09:45
nin
 
theyarecomingforyou, you had problems with Oblivion with that?! I dunno... maybe it really was because I am using a Q6600, but when I got my 8800GTX Ultra and ran Oblivion, it had zero problems with graphics at full. In fact, I had to keep combing through the config files and config menus to make sure it really was set with all graphics high, but (and this was in Vista, with the Bioshock drivers) Oblivion was a pushover for my computer.

I just reinstalled that last night, and it's zooming right along so far...

edit: Between that and his mouse issue in QW, I think tacfu's system is borked...




-----------------------------------------------------
Bioshock: "You're soon beset by deranged flappers and dandies, like Jay Gatsby's party guests gone feral."
This comment was edited on Sep 14, 09:46.
50.
 
Re: ...
Sep 14, 2007, 09:37
 
this move strikes me as unnecessary and it sounds like they've received a nice fat cheque from Microsoft.

I think Crytek's explanation needed more meat to it, so techies could get their heads around it.

Almost every developer that does a DX10 path gets compensated; so what? The problem is most other games have very little benefit or tangible difference (no extra assets) between DX9 and DX10; they just add it as fluff and call it DX10. The reason DX10 can't take off is the currently required DX9 support: keeping DX9 and DX10 MP-friendly online forces devs to keep DX10 support limited. Making DX10 and DX9 MP-compatible requires that your DX10 changes make no tangible difference to the game world (most games). Frankly, I am tired of some of these recent DX10 paths; they are a joke that doesn't take advantage of the API or offer anything really beneficial at all.

Crytek is being faulted because they have built enhancements into the engine for higher-end HW and chose DX10 as the line of demarcation. A programmer with some years of experience may see why this whole dilemma comes up this way, and why they can't just add new functionality based on HW. Which, BTW, no developers do: you get no real tangible extras for high-end HW, just the same limitations on the tangible game world as four-year-old HW in most games.

They added enhancements for higher-end physics, but can't you just separate by HW (multicore or GPU), as easy as that sounds?
Can you just detect a fast enough CPU and GPU on either API and switch on the extra effects?
How, exactly?
Where do you make the cutoffs in HW for each possible platform combination?
That is also two major codepaths for each render path and API, which is a huge potential bug fest and makes tracking bugs down very difficult.
How do you then separate servers for the people who have more objects and assets due to their HW cutoff?
How do people know which server to join if they reached the higher HW level?
There are also likely DX10 API functions used to handle more objects in explosions, where DX9 would choke on some of that.

For once, someone looks to be actually leveraging DX10!
The penalty of having to do it is that you can't play MP with those effects against DX9 clients, which other devs avoid by making a vanilla, weak DX10 render path.

DX10 as a cutoff makes this easiest for developer and user, really.

The major reason this is coming up now is that Crytek seems to have gone further with their DX10 renderer and has actually had to face the ultimate problem that any other developer who implements anything in DX10 beyond the usual crappy effects we have seen thus far (which could really be done in DX9) will have to face: DX9 and DX10 will not be compatible online if you really take advantage of the DX10 API enhancements and change the game world's assets between the two APIs. If I see a fallen tree hiding an enemy and the enemy sees no tree, it's a problem.
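The fallen-tree example is a shared-world-state consistency requirement: every client on a server must agree on the destructible world. One hypothetical way a server could detect divergence (all names invented for illustration) is to compare a hash of each client's destructible-object state:

```python
import hashlib

def world_checksum(destructible_objects) -> str:
    """Hash the destructible-object state all clients must agree on."""
    data = "|".join(f"{name}:{intact}" for name, intact in
                    sorted(destructible_objects))
    return hashlib.sha256(data.encode()).hexdigest()

dx10_view = [("hut_3", True), ("tree_17", False)]  # tree shot down
dx9_view = [("hut_3", True), ("tree_17", True)]    # no physics: tree stands
# The two views diverge, so these clients can't share a server.
assert world_checksum(dx10_view) != world_checksum(dx9_view)
```

Since DX9 clients can never produce the DX10 world states, the only clean fix is the one the news post describes: separate server populations by feature level.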

I applaud Crytek for looking like a developer that is actually trying to really leverage DX10, for once! As well as giving people with higher-end HW some benefits, something no other developer does beyond the usual basic crap of enhanced shadows, lighting, and textures.

Props to Crytek, but they need a better explanation of the reasons.


This comment was edited on Sep 14, 09:49.
49.
 
Re: ...
Sep 14, 2007, 09:36
 
theyarecomingforyou, you had problems with Oblivion with that?! I dunno... maybe it really was because I am using a Q6600, but when I got my 8800GTX Ultra and ran Oblivion, it had zero problems with graphics at full. In fact, I had to keep combing through the config files and config menus to make sure it really was set with all graphics high, but (and this was in Vista, with the Bioshock drivers) Oblivion was a pushover for my computer. It also isn't all that pretty anymore. I liked Bioshock more, but then I love Bioshock's decor style and I'm a total sucker for water effects.

48.
 
...
Sep 14, 2007, 09:26
 
I thought for older games like oblivion and older I may be able to slide with MAX settings at that resolution with a decent framerate
Ha. Sorry, but Oblivion is just a beast - full stop. I thought the same thing with my system, but I was wrong. I'll just list my system before starting:

E6750 2.67GHZ (@3.32GHz)
8800GTS 513/800 (@650/875)
2GB RAM
Raptor 150GB 10,000RPM
22" Widescreen (1680x1050)

It's not as high-end as your intended setup, but it's somewhat similar. Oblivion runs pretty poorly, all things considered; it seems the engine is limited and doesn't improve much when you throw top-end graphics and multi-core processors at it. Still, obviously most games run well, but I certainly wouldn't have gone for a 24" monitor myself; whilst I would LOVE one for desktop use, it would just bring my graphics card to its knees. Sure, you can get SLI, but it doesn't support all games equally (or at all), and SLI support in Vista isn't as good. If SLI increased performance in every game by 75-100% it would be brilliant, but some games just don't benefit and others actually exhibit a performance loss.

For a single 8800GTX get a 22" widescreen.

- - - - - - - - - - - - - -
Founder of the "I Hate Smiley Fitz" society

Remember: Riley has autism. He has trouble communicating, and in an overstimulating
environment, he can get frightened and run away, leaving his parents frantic. - Auburn
"The price of freedom is eternal vigilance."