An update to the Unreal Technology Page explains some of the Direct3D
anomalies that people have been running into (thanks Unreal Universe).
Here's what it says:
We've been looking at the feedback on Direct3D performance and investigating
some strange reports. Recently, we've mainly been testing on 96-meg and 128-meg
machines (I have a Celeron 400, Jack Porter has a K6-2 450). On these machines,
TNT1 performance is good -- average 28 fps at 640x480, 25 fps at 800x600.
TNT2 performance is significantly better.
However, upon removing some RAM and testing Direct3D on a 64-meg K6-2, the "precache"
time increased by about 5X, and performance dropped to a few frames per second.
These performance drops don't occur in the software renderer, and don't occur
in Glide. Something is going wrong between Unreal, Direct3D, and the TNT's Direct3D
driver, and we're investigating.
Overall, the feedback indicates a very wide variance in performance among TNT
users, much more so than with any other card. Our internal testing bears this
out; for example, we've found (and worked around) a lot of driver bugs
that only happen on one machine, and not others with otherwise similar configurations.
Don't Try This At Home Dept.: Some TNT users have reported that tweaking their
BIOS's "AGP Aperture Size" improves performance on 64-meg machines.
We tried this and couldn't find any difference on our 64-meg test machine.
Others report that the Creative Labs unified drivers (with TNT Glide support)
outperform Direct3D on their cards. If anybody finds definite improvements or
workarounds, or has insight into what's happening, please email utbugs@epicgames.com
and let us know.
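For those wondering what the "AGP Aperture Size" setting mentioned above
actually controls: it caps how much system RAM the video card can address over
AGP for textures, and AGP textures live in that same system RAM. As an
illustrative aside (this is not from Epic's post, and whether Unreal itself
takes this path is an assumption), here is a minimal C++ sketch using the
standard DirectDraw 7 interface to query how much local (on-card) versus
non-local (AGP) texture memory a driver reports:

// Minimal sketch: query local vs. AGP (non-local) texture memory via
// DirectDraw 7. Link against ddraw.lib and dxguid.lib. Hypothetical
// editorial example -- not code from Epic or the Unreal engine.
#include <windows.h>
#include <ddraw.h>
#include <cstdio>

int main()
{
    IDirectDraw7* dd = nullptr;
    // DirectDrawCreateEx is the DirectX 7 entry point.
    if (FAILED(DirectDrawCreateEx(nullptr, (void**)&dd,
                                  IID_IDirectDraw7, nullptr)))
        return 1;

    DDSCAPS2 caps = {};
    DWORD total = 0, avail = 0;

    // Texture memory on the card itself.
    caps.dwCaps = DDSCAPS_TEXTURE | DDSCAPS_LOCALVIDMEM;
    if (SUCCEEDED(dd->GetAvailableVidMem(&caps, &total, &avail)))
        printf("Local texture memory:   %lu KB total, %lu KB free\n",
               total / 1024, avail / 1024);

    // Texture memory reachable over AGP; the BIOS aperture limits this pool.
    caps.dwCaps = DDSCAPS_TEXTURE | DDSCAPS_NONLOCALVIDMEM;
    if (SUCCEEDED(dd->GetAvailableVidMem(&caps, &total, &avail)))
        printf("Non-local (AGP) memory: %lu KB total, %lu KB free\n",
               total / 1024, avail / 1024);

    dd->Release();
    return 0;
}

On a 64-meg machine, whatever the aperture lets the card map is the same
physical RAM the game needs for everything else, which may be one reason
aperture tweaks help some configurations and do nothing on others.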