More NVIDIA vs. Futuremark

Nvidia accused of fudging tests (thanks Mike Martinez) is the report of the latest salvos fired in what's become a bit of a feud between the graphics chipset creators and the benchmark designers (story). In related news, the Futuremark Website (thanks Tony!!!) has a 3DMark03 patch for download with the following note: "We have now established that NVIDIA’s Detonator FX drivers contain certain detection mechanisms that cause an artificially high score when using 3DMark®03. We have just published a patch 330 for 3DMark03 that defeats the detection mechanisms in the drivers and provides correct results." Here's a bit of point/counter-point from the CNET article:

"Recently, there have been questions and some confusion regarding 3DMark 03 results obtained with certain Nvidia" products, Futuremark said in the statement. "We have now established that Nvidia's Detonator FX drivers contain certain detection mechanisms that cause an artificially high score when using 3DMark 03."

A representative at Nvidia questioned the validity of Futuremark's conclusions. "Since Nvidia is not part of the Futuremark beta program (a program which costs hundreds of thousands of dollars to participate in), we do not get a chance to work with Futuremark on writing the shaders like we would with a real applications developer," the representative said. "We don't know what they did, but it looks like they have intentionally tried to create a scenario that makes our products look bad."
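
For anyone wondering what a "detection mechanism" might actually look like, here is a minimal, purely speculative sketch (in Python, just to show the idea; the shader strings and the substitution are invented for illustration and are not anything NVIDIA is known to ship). A driver can fingerprint the shaders an application submits and quietly swap in hand-tuned replacements; a patch that changes those shaders even slightly breaks the fingerprint match, which fits Futuremark's description of what patch 330 does:

    import hashlib

    # Hypothetical fingerprint table: the "benchmark shader" and its cheaper
    # substitute are made up for this sketch.
    ORIGINAL = "mul r0, v0, c0   ; benchmark's original shader"
    SUBSTITUTE = "mov r0, c0       ; cheaper hand-tuned substitute"
    REPLACEMENTS = {hashlib.md5(ORIGINAL.encode()).hexdigest(): SUBSTITUTE}

    def compile_shader(source):
        # Stand-in for the driver's real shader compiler.
        return "compiled(" + source + ")"

    def maybe_substitute(source):
        # If the incoming shader matches a known fingerprint, compile the
        # substitute instead; otherwise compile exactly what the app supplied.
        digest = hashlib.md5(source.encode()).hexdigest()
        return compile_shader(REPLACEMENTS.get(digest, source))

    # The original shader is recognized and silently replaced...
    print(maybe_substitute(ORIGINAL))
    # ...but a patched benchmark whose shaders are reshuffled hashes
    # differently, so the lookup misses and no substitution happens.
    print(maybe_substitute("mul r0, v0, c0   ; same math, reshuffled by the patch"))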

66 Replies. 4 pages. Viewing page 1.

66. Re: I'm just curio May 29, 2003, 12:39 MindTrigger
 
I have never cared about 3dmark's software, and I'm not going to start caring anytime soon. I've always used the game engine itself to measure results. Tom's hardware always has both in their reviews and I skip right over the "synthetic" benchmarks.

I'm a little bit confused about the whole business of people paying 3DMark for anything. I mean, isn't their job to be a fair comparison of all the hardware being tested? Why would I want nVidia or ATI (or anyone else) to be able to work WITH 3DMark while creating software meant to COMPARE products? Why would I want them to be able to program drivers specifically for a benchmarking program?

As far as application detection goes, I say BRING IT ON. Make it a freaking checkbox in the driver settings. As my card ages I would LOVE to know there is a setting like that which will allow me to use engine-specific shortcuts to extend the life of my card. I surely don't want the driver deciding for me, but I would enjoy the option to enable this myself if needed.
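
The checkbox idea could work as an opt-in, per-application profile; here is a rough sketch of the concept (the executable names and flag names are made up for illustration and are not taken from any real driver):

    # Hypothetical per-game optimization profiles; a real driver would expose
    # these through its control panel rather than a Python dict.
    APP_PROFILES = {
        "ut2003.exe": {"reduced_precision_shaders": True,  "aggressive_culling": True},
        "quake3.exe": {"reduced_precision_shaders": False, "aggressive_culling": True},
    }
    DEFAULTS = {"reduced_precision_shaders": False, "aggressive_culling": False}

    def settings_for(exe_name, user_opted_in):
        # The key point of the post: nothing kicks in unless the user checked the box.
        if not user_opted_in:
            return dict(DEFAULTS)
        return {**DEFAULTS, **APP_PROFILES.get(exe_name.lower(), {})}

    print(settings_for("UT2003.exe", user_opted_in=True))   # engine-specific shortcuts on
    print(settings_for("UT2003.exe", user_opted_in=False))  # stock behavior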

I think 3DMark is the real problem here. They are creating an environment where companies either pay them, and reap the benefits of being able to "work directly with them" (whatever that means) or create shortcuts on their own.

No matter what, I will continue NOT caring about the numbers 3Dmark puts out.

This comment was edited on May 29, 12:42.
 
--
He cut the possum's face off then cut around the eye socket. In the center of the belt buckle, where the possum's eye would be, he has placed a small piece of wood from his old '52 Ford's home made railroad tie bumper. Damn, he misses that truck.
 
65. Re: I'm just curio May 25, 2003, 22:27 Jaritsu
 
5800 Ultra.

 
 
64. Re: I'm just curio May 25, 2003, 20:26 SirOvenMitt
 
Is that the 5800 reg or the 5800 Ultra?

 
 
63. Re: I'm just curio May 25, 2003, 12:31 Jaritsu
 
SidVicious:
While not an official benchmark by any means, the guy we know here with a 5800 went from about 170fps flyby to 221fps flyby with the new 44.03 drivers. He was pretty impressed, and so was I.
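
For what it's worth, that jump works out to roughly a 30% gain in that one flyby; quick arithmetic on the figures quoted:

    old_fps, new_fps = 170, 221          # the flyby numbers above
    gain = (new_fps - old_fps) / old_fps
    print("{:.0%}".format(gain))         # -> 30%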

UT2K3 is about the only game where higher scores are impressive, since in Q3 you get as many fps as you want on a modern system, and most other games don't have a great benchmark built in.

 
 
62. Re: No subject May 25, 2003, 11:41 Von Helmet
 
Botmatch demos in UT2K3 seem to be a much better bet at the moment.

 
 
61. Re: No subject May 25, 2003, 11:30 DrEvil
 
#49 hit the nail right on the head. 3dmark means jack and shit. Actual game performance, in real games that people play, is what matters, and it is all that matters. If you're one of the morons that runs around the web comparing 3dmark scores, you need to wake up.

It's obvious Futuremark is doing something the industry isn't, because real-world game benchmarks show completely different results than 3dmark. Feel free to put all your faith in this shit; you're only hurting yourself.

 
 
60. No subject May 24, 2003, 21:10 kjackie
 
Why didn't Tomshardware, HardOCP, or Anandtech catch this when it first came out?

 
 
59. nVidia Cg May 24, 2003, 18:58 Pr()ZaC
 
C for Cheat

 
 
58. Re: I'm just curious. May 24, 2003, 18:19 Von Helmet
 
OK... so it looks like the new Detonator drivers make the new FX perform brilliantly. At what cost? The cost of image quality.

Now... I've always been a sucker for image quality. I used to play Half Life at about 3 fps just because it looked better at higher resolution with more detail.

These days, you have the choice between... say... the Radeon 9800 Pro and the Geforce FX 5900 (Ultra? Does it have Ultra on the end of this one?). Both of these cards can play UT2K3 (probably the most advanced game these days) at, I don't know, over 100fps at max detail at 1024*768. I run a Radeon 9500 Pro and it pulls in about 50-60 fps at that level of detail, so I'm sure the 9800 Pro and 5900 can stomp all over that and hit the 100s with ease.

So the 9800 Pro runs things at 100fps, and the FX 5900 runs things at 125fps. Does it make any difference? No, none at all. Even with all your arguments about how framerate affects gameplay and clock ticks and so forth, that kind of difference is going to be minuscule when you're already in the 100s.
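
Put in frame-time terms, that gap is about 2 ms per frame; a quick calculation using the numbers above:

    for fps in (100, 125):
        print(fps, "fps =", 1000.0 / fps, "ms per frame")
    # 100 fps = 10.0 ms per frame, 125 fps = 8.0 ms per frame: a 2 ms
    # difference each frame, which is very hard to notice at these rates.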

So the difference? The ATI card produces a better picture, by all accounts. It produces cleaner and crisper graphics. Like I said, I'm a sucker for image quality, so I think it's pretty obvious where my money would be. Assuming I had any. Curses.

Anyway. When both cards are that good anyway, I think it's worth the tiny sacrifice in performance to make it look better.

 
 
57. Re: I'm just curious. May 24, 2003, 16:56 SidVicious
 
Rasguedo: no.

It is not "fudging," IMO. It is deliberate and well-planned fraud. They claim no less than 25% better performance than their competitor, then tell you to pony up $500. I don't think so. This is a huge misrepresentation of a product with a high dollar value, no matter who's buying it.

If you think I'm over the top, check the definition for the word "fraud." You may want to reconsider your stance.

And "5-10 fps" isn't quite accurate either. I went to an nVidia presentation on their FX cards recently, and they were pushing ratios even larger than the ones they "fudged" on build 320 of 3DMark. Much larger. And they expected, or hoped, that we would turn around and parrot those numbers to a world full of billions of dollars in potential consumer and business-level revenue.

Yes, it is a business. Yes, they will be cutthroat. No, this behavior is not irrelevant or excusable. It also reveals a chasm of disrespect for the consumer, the benchmark maker, and the tech industry media.

 
 
56. No subject May 24, 2003, 16:25 ExcessDan
 
Oh ya? Well Nvidia has a partnership with EA and also with the company making STALKER, so there.

"The" Dan
Intel 486SX, Trident video, 8MB RAM, 14" Generic Monitor, 100 MB HDD, Windows 3.11
http://www.ghostmastergame.com
"It's where it's at!" --Dantastic
 
 
55. Re: ohh ya and May 24, 2003, 16:00 Wowbagger_TIP
 
do you people not understand that the whole damn industry, aside from nvidia's recent bitch ass, works with futuremark?

And if everybody jumped off a bridge...

 
 
"The whole problem with the world is that fools and fanatics are always so certain of themselves, but wiser people so full of doubts." -- Bertrand Russell (I think...)
 
54. ohh ya and May 24, 2003, 15:34 kyleb
 
what is with these bs arguments about ati being "in bed" with futuremark? do you people not understand that the whole damn industry, aside from nvidia's recent bitch ass, works with futuremark?

 
 
53. Re: What's the point of 3dMark2003? May 24, 2003, 15:32 kyleb
 
Dante_uk, do you know why UT2003 is faster using OpenGL? Because the OpenGL renderer is not full-featured, and the game is meant to be run in D3D. If you turned off the projector shadows and whatnot in D3D, it would be faster than OpenGL.

 
 
52. Re: Hahahahaa May 24, 2003, 15:11 LordSteev
 
ATI has a partnership with Valve. The real question is which is more important to you, the gamer?

 
-LordSteev

Supporter of the "Unleash the Fredster!!" Fan Club
 
51. Re: Hahahahaa May 24, 2003, 14:44 Jaritsu
 
or is it ati that is desperate? they pay 100's of thousands of dollars to help develop a program from futuremark which rates futuremark's investors' products.

Agreed

As far as I'm concerned, ATI can stay in bed with Futuremark and spend all the time they want developing to cater to their needs.

NVidia has a longstanding partnership with id and it shows. ATI has a partnership with Futuremark and it shows.

The real question is which is more important to you, the gamer?

 
 
50. Simple Consideration May 24, 2003, 10:59 Maxx
 
Regardless of your opinion on the 'cheating' or on Futuremark's post-manipulations, the simple fact is that Nvidia outright lied. Re-read the second quoted paragraph above. The rep clearly states that Nvidia is not part of Futuremark's beta program. This is true. However, Nvidia WAS part of the program when 3DMark03 was released! Therefore they DID have a chance to work on writing the shaders. In fact, the whole reason they left the program in the first place was that their new card was getting poor scores on the benchmark. They then turned around and pretty much canned that card, and have now brought out a new one and are claiming foul play from Futuremark. To set the record straight, the only entity that's playing games is Nvidia.

This comment was edited on May 24, 11:00.
 
 
49. What's the point of 3dMark2003? May 24, 2003, 09:33 Dante_uk
 
Really, what's the point?

You look at the benchmarks for the following:
Q3 Engine (at least a dozen licensed games and more coming)
Unreal Engine (lots of licensed games and more coming)
Doom 3 (Doom 3, Quake 4, an unknown title by Human Head Studios)
3dMark2003 (none, nothing, petty tech demo)

Compare top Nvidia card with top ATI card.
Currently:
Nvidia wins in Quake3, Unreal, Doom3
ATI wins in 3dMark2003

Okay now which card do I buy ?

How can those results be fair and true?
None of those game engines are DirectX9, but then only one game test in 3dMark2003 is actually using any DirectX9.

If the top ATI is better than the top Nvidia card why don't ANY other benchmarks show that?

What's Cheating ?
If there isn't code in the drivers from both companies to make these game engines run faster, then why the hell not? We all play games using these engines!!
I for one want the best performance and best images I can get.

3dMark2003 claims to be about apples-to-apples comparisons. What's the point?
I write a DirectX function to spin a cube. It works, and it shows me an FPS score. One card runs it faster than another.
Is the faster card going to play Quake 3 better and faster than the other card? Would anyone be willing to bet money on which card plays Unreal 2 better and faster based on how my spinning cube runs?
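
To make that concrete, here is a toy stand-in for the spinning-cube test, sketched in Python rather than actual DirectX code (the draw_cube body is just placeholder work): it times one narrow task and spits out a single frames-per-second number, which is exactly why such a number says little about how a full game engine will run.

    import math
    import time

    def draw_cube(angle):
        # Placeholder "rendering" work standing in for one frame of a spinning cube.
        for _ in range(1000):
            math.sin(angle) * math.cos(angle)

    frames = 0
    angle = 0.0
    start = time.perf_counter()
    while time.perf_counter() - start < 1.0:   # run the loop for one second
        draw_cube(angle)
        angle += 0.01
        frames += 1

    print(frames, "'frames' per second")       # one number, measuring one workload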

Then you have the fact that 3dMark is only DirectX.
Most of the games I play use the Quake 3 engine (OpenGL), plus I play IL-2 using the OpenGL renderer because it's faster, and I play UT2003 using the OpenGL renderer because it's faster.
What should I use to see who has the best OpenGL support on their cards?
Answer: easy, I run Q3 and UT2003!!

Nvidia is only guilty of wasting time bothering to get a better score with 3dMark2003.
Anyone stupid enough to put their faith in the performance of this program deserves what they get.
This comment was edited on May 24, 10:05.
 
 
48. Re: I'm just curious. May 24, 2003, 08:14 Outlaw
 
Hebrew_national, I purchased a GeForce FX 5600 with 256 megs of RAM a few weeks ago. It smokes my GeForce 3 in speed and image quality. With the new FX drivers and a slight overclock I'm running my latest games at 1280x1024 with smooth framerates, with the drivers set to the quality setting. Some of these games include UT 2003, Vice City, Rainbow Six 3, Vietcong, and Jedi Knight 2. From what I'm seeing this card should do a good job in Doom 3.

This comment was edited on May 24, 10:09.
 
 
47. Re: Just a rumor. May 24, 2003, 04:43 David Johnston
 
Wouldn't happen... QA would pick it up.

 
 