GeForce 6 Announced

NVIDIA Launches GeForce 6 Series - Biggest Performance and Feature Leap in Company History is the press release accompanying the spate of previews in this morning's Tech Bits. The release offers technical info along with the usual "industry-leading" and "revolutionary" phrasing, touting what they call "the biggest generation-to-generation performance leap that we have ever seen with a new GPU." Word is: "Retail graphics boards based on the GeForce 6800 models are slated for release in the next 45 days." The release also quotes industry luminaries John Carmack and Tim Sweeney on the new hardware:
"As DOOM 3 development winds to a close, my work has turned to development of the next generation rendering technology. The NV40 is my platform of choice due to its support of very long fragment programs, generalized floating point blending and filtering, and the extremely high performance," said John Carmack, president and technical director of id Software.
    And:
"The NVIDIA GeForce 6 Series is a great leap forward in PC graphics, bringing next-generation Microsoft DirectX 9 rendering performance to new heights while continuing forward with the high-definition floating-point rendering, driver performance, and stability NVIDIA is famous for," stated Tim Sweeney, founder of Epic Games. "As the first GPU supporting the new Pixel Shader 3.0 programming model, the GeForce 6 Series enables games to use entirely new rendering approaches and obtain higher performance per-pixel lighting, shadowing, and special effects than previously possible."
More on this can be found in: NVIDIA GeForce 6800 Models of GPUs to Be Offered by Record Number of Partners and Game Creators and Publishers Praise NVIDIA GeForce 6 Series. Thanks Frans.
147 Replies. 8 pages. Viewing page 1.
147.
 
Nvidia is Teh win
Apr 19, 2004, 21:52
 
Well, with this move and the optimizations found on the nForce3 250Gb, NVIDIA is definitely in the lead again. Unfortunately the 6800 raises the price point, and the motherboard optimizations are a little below the belt. Anyway, bravo! This battle is definitely still raging despite all projections!

146.
 
Re: quantum!?
Apr 18, 2004, 04:45
 
There's one critical flaw with that post: soon



145.
 
quantum!?
Apr 17, 2004, 21:48
 
Quantum computing GPUs will change everything. Soon the whole world will have computers that control everything. We can call it SkyNet! Computers rulex0r!

144.
 
Re: I wonder
Apr 17, 2004, 06:05
 
Looks like ATI are doing an NVIDIA.
ATI engineer 1: Arrgghhh, the NV40 has 16 pipelines, does PS 3.0, balances your checkbook for you, and gets 12,500 stock in 3DMark03. What are we going to do?
ATI engineer 2: Um, we'll whack 4 more pipelines onto the R420, wind the clock speeds up to melting point, bolt on a leafblower, and tell everyone that 3DMark03 is not a true representation of performance.

There is a reason the clock speed on the R420 is so insanely fast. It's 'cos ATI are shitting themselves :)

This comment was edited on Apr 17, 06:08.
143.
 
Re: Re: Yo
Apr 16, 2004, 15:18
 
512mb gfx card ram overkill... hm, no, the way games are going even now that's not overkill

What are you smoking? The best video cards will always outpace the latest games, and now is no different. Considering the 9800 XT can whip anything currently thrown at it, I don't see how you can think that 512 MB of RAM on a graphics card is not overkill.
Paranoid is absolutely right. RAM gains from 128 to 256 MB are totally negligible. 512 MB on a video card is a colossal waste of money. Hell, I have 512 MB of system RAM and a 9700 non-pro, and I can run Far Cry at 1280x1024 without a problem. Your statement is unfounded.

142.
 
Re: Yakumo
Apr 16, 2004, 09:46
 
If you check the game benchmarks, with the extra 128 MB of RAM the 256 MB cards are getting on average less than a 10% increase in frame rate (closer to 5% in some games), and only at the highest resolutions. To me that does not warrant the purchase of a card that is anywhere from $100 to $200 higher in price, and that is why I say it is overkill. The CPU is still the bottleneck when using one of those cards. You may be right and things may begin to change, but over the past year and a half it has been a waste of money and overkill to purchase a 256 MB card, IMO.

This comment was edited on Apr 16, 09:50.
141.
 
Re: Bla
Apr 16, 2004, 07:24
 
even Doom 3 won't use the full dx9 feature-set AFAIK but please feel free to correct me on this one.

It's OpenGL, so it won't use any of it.

140.
 
Re: Yo
Apr 16, 2004, 06:10
 
512 MB of gfx card RAM overkill... hm, no, the way games are going even now that's not overkill. If you look at things like UT2K4, you'd probably find something like 60% of that 2 gigs of crap it dumps on your HD is texture data (guessing, I don't have the game). And with shaders becoming that much more complex, and every surface wanting maybe a texture map + bump map + diffuse map + displacement map and god knows what else, you need all that data stored in the fastest RAM possible for the card to deal with it all.

This comment was edited on Apr 16, 06:10.
139.
 
Re: Time has come
Apr 16, 2004, 06:05
 
I thought NVIDIA's bridging was just for the PCIe chip on a card designed for an AGP slot?

i.e., no bridge on the PCIe card

138.
 
Re: Time has come
Apr 16, 2004, 05:55
 
Paranoid, I wasn't talking about an effects mod, or basic soft shadows (which can be done in DX8 anyway), or something that's fundamentally ineffectual to gameplay; I'm talking about a renderer built from the ground up for PS 3.0, DX9, OGL2, etc. The best example of that is Unreal Engine 3 (as Tim Sweeney calls it), which is more than two years away.

As for PCI Express: it's not going to bring any instant benefits, and you won't be disappointed by it if you don't expect it to do things it wasn't designed to do. ATI's and NVIDIA's new cards will have PCIe versions (NVIDIA's using an AGP->PCIe bridge, ATI's with native PCIe support), but they're more a convenience for those who buy PCIe-enabled motherboards. PCIe is there to remove the limits of AGP that graphics cards are reaching, not to give an instant benefit; the benefits will come gradually over time as those limits are surpassed.

137.
 
Time has come
Apr 16, 2004, 05:12
 
The time has come to see the start of a new battle.
Let's see if NVIDIA can win this time. ATI will launch its new card in 10 days.
Interesting times to read reviews...

(RDS)

136.
 
Re: Yo
Apr 15, 2004, 21:02
 
I should have been more specific. I realize D3 is OpenGL. It's mainly Stalker, which I want to see with the new shaders; that's what I meant. I am dying to see all of them on a new 6800. Not the Ultra, but a regular 6800. Believe it or not, they will have a version with 512 MB of GDDR3 coming out soon. Talk about overkill; even 256 MB is overkill.

I know what D3 & HL2 will look like, though I imagine HL2 will benefit from the new shaders as well, since they are able to make the soft shadows much more lifelike. That should add to the immersion.

But neither of those games has me as excited as Stalker. I was hoping to see it at the release party but no such luck.

The Battle for Middle-earth (RTS) was very impressive, and the modded Far Cry was too. EQ2, on the other hand, was amazing-looking but had a whole lot of clipping going on, and the little demo shown has me wondering how well the gameplay mechanics will work. But as far as the creatures and character models go... they were stunning. It's due out in the fall, so they have time to tweak/fix it.

The creatures shown using Unreal 3 (not Unreal Engine 3, but Unreal 3; that's how it was labeled at the show) looked absolutely amazing. The largest creature has a higher polygon count than an entire level from UT, released in 2001. You could see the pores and spots on its skin. And the mermaid, whose hair is actually thousands of fibers that let light pass through at the ends, looks much more lifelike.

I can't wait to see ATI's response and their new card. I am so very happy I waited this extra couple of months to buy a new graphics card. Otherwise I would be using a 9800 Pro and have paid >$300 for it. I may still get one but at a much more reasonable price now that the next gen cards are looming.


This comment was edited on Apr 15, 21:07.
135.
 
No subject
Apr 15, 2004, 20:59
 
This is going to be my year to upgrade, so I am acutely interested. But what about PCI Express? What, if anybody here knows, will be the advantage of a PCI Express card over its AGP equivalent? Is it going to be a big letdown like AGP 2x to 4x to 8x was? Can ATI match NVIDIA performance-wise? ATI is more aggressive when it comes to pricing, and from the speculation on their new series of cards it sounds like there may be some more soft-modding bargains. And it looks like NVIDIA is back in the game. It's gonna be interesting!

134.
 
Re: Yo
Apr 15, 2004, 15:43
 
"So I can't wait to see D3, HL2, and Stalker with Shader 3.0. Going to be a good year for gamers"
Pixel Shader 2.0 and 3.0 are DX9 functions. You will never see them used in OpenGL games. Never ever ever. Doom 3 doesn't even make use of the most advanced shaders available for our 9700s. Doom 3 is coded in a magic language called OpenGL. No Shader 3.0. No Shader 3.0.
No Shader 3.0 for Doom 3.0


3.0

The companies will be very pleased.
STAY RIGHT WHERE YOU ARE, GET OUT OF THAT BED AND GET DOWN ON THE FLOOR, GET OUTSIDE RIGHT NOW, RIGHT HERE: GET DOWN ON THE CEMENT, I DONT CARE IF YOU'RE NUDE, GET DOWN ON THE CEMENT, I DON'T CARE IF ITS FREEZING! WHERES THE DRUGS, WE KNOW YOU GOT THE DRU
133.
 
Re: Yo
Apr 15, 2004, 14:44
 
So I can't wait to see D3, HL2, and Stalker with Shader 3.0. Going to be a good year for gamers.

Who says D3, HL2, or Stalker are actually coming out this year?? Haha.

132.
 
Re: I wonder
Apr 15, 2004, 13:25
 
My big beef is that NVIDIA has screwed the small form factor guys with two generations of boards now. Many of us LANers use the Shuttles and again cannot use the new NVIDIA card because of size limitations... I guess ATI will keep winning the small form factor business. NVIDIA had better take notice: small is the "new" thing!

131.
 
Yo
Apr 15, 2004, 10:10
 
"Personally I'm going to take this opportunity not to upgrade at all (for the first time). These cards will be old news long before games will use their capabilities; I'm not paying £400+ for some big numbers in benchmarks."

Wrong. Far Cry was already released and/or finished, and then it took the developers "three weeks" to make a "mod" that takes advantage of the 3.0 shaders. They made an amazing-looking game engine look even more amazing. The CEO/prez of Crytek (can't remember his name) was there at the release party.

I own an ATI card and have skipped two generations, so I have been waiting for these next-gen cards to be released. So now I am left waiting for ATI's next big thing before I buy. But I hope to god they have Shader 3.0 capabilities, because if not they just got bitch-slapped. The new shaders made Far Cry that much more impressive. So I can't wait to see D3, HL2, and Stalker with Shader 3.0. Going to be a good year for gamers.


130.
 
Re: Bla
Apr 15, 2004, 09:44
 
Get it straight, guys: OpenGL does not make use of DX9.


129.
 
Re: I wonder
Apr 15, 2004, 09:14
 
Still think carmack made a wise choice that had nothing to do with money?

Since we don't have benchmarks for the ATI card yet, that's rather hard to say now, isn't it? Higher clock and memory speed do not necessarily mean a faster card. There are rumors flying about that the shaders on the R420 chipsets aren't as complete as those on the NV40. But, once again, that's just rumor.

As for Carmack taking money for this -- it may be, but I doubt it. He's recommended many different companies in the past, based purely on technical performance.

It really is sad to see the fanboys come out of the woodwork on these threads, for both ATI and NVIDIA. They're just freaking companies. Your purchase of their card does not mean that you need to defend them with your honor. Both companies have had leading products in the past, both have had sucky products in the past, and both have totally screwed their customer base in the past. Trust me... they have no loyalty to you. Buy whatever works best for the best price when you need/want a new card. Anything else is just silly.

128.
 
Re: Unreal 3???
Apr 15, 2004, 07:30
Beamer
 
 
It wasn't choppy at all for me... maybe it's your player?

Nah, watch it again, when the game camera goes under the arch. The first couple of outside minutes, especially there, seem to chop; then it smooths out inside. Of course, it's full detail, new drivers, and a rough engine, so it's no big deal.
