GeForce GTX TITAN Announced

NVIDIA announces the GeForce GTX TITAN, saying their new graphics card can make your PC a "gaming supercomputer" in the vein of The World's #1 Open Science Supercomputer. Word is: "Despite being ten times more powerful than its predecessor, the Titan supercomputer takes up the same amount of space and uses the same amount of power. Only GTX TITAN brings this rare combination of raw power and incredible efficiency to PC gaming." There are p/reviews of the new card on AnandTech, Guru3D, Hardware Canucks, Hardware Heaven, Legit Reviews, and Overclockers Club.

78 Replies. 4 pages. Viewing page 1.
< Newer [ 1 2 3 4 ] Older >

78. Re: GeForce GTX TITAN Announced Feb 21, 2013, 07:19 dj LiTh
 
eRe4s3r wrote on Feb 21, 2013, 01:14:
The only thing I "told" is that GPUs, if you take them as a whole, are 40 times faster than the CPUs we have now. That is not a small red orange, that is reality*

*In certain unix performance benchmarks

I am just confused why some people think CPUs are fast... Ask an AI or physics coder what he thinks about the current state of CPU development.

Well, since you like comparing things that have nothing to do with each other, let's keep on that note: a CPU has lower latency than a GPU, and a GPU has higher throughput than a CPU. It's easy to cherry-pick when you're comparing two completely different animals.
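A minimal sketch of that latency-vs-throughput difference (plain Python, hypothetical toy workloads): the same number of operations is latency-bound when each step depends on the previous one, and throughput-bound when all steps are independent.

```python
def serial_chain(n):
    """Each step depends on the previous result, so the work cannot be
    split across cores -- total time is bounded by single-core latency,
    which is where CPUs win."""
    x = 0
    for i in range(n):
        x = (x * 31 + i) % 1_000_003  # dependency on x forces serial execution
    return x

def independent_map(n):
    """Every element is computed on its own -- the shape of work a GPU's
    thousands of slower-but-numerous cores chew through (throughput)."""
    return [(i * i) % 1_000_003 for i in range(n)]
```

The first function gains nothing from more cores no matter how many you add; the second could be chunked across any number of workers.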

 
 
77. Re: GeForce GTX TITAN Announced Feb 21, 2013, 01:14 eRe4s3r
 
The only thing I "told" is that GPUs, if you take them as a whole, are 40 times faster than the CPUs we have now. That is not a small red orange, that is reality*

*In certain unix performance benchmarks

I am just confused why some people think CPUs are fast... Ask an AI or physics coder what he thinks about the current state of CPU development.
 
 
76. Re: GeForce GTX TITAN Announced Feb 20, 2013, 18:22 AngelicPenguin
 
eRe4s3r wrote on Feb 20, 2013, 12:05:
dj LiTh wrote on Feb 20, 2013, 11:07:
...Your statement is completely false...

Wha... how can you not get what I mean, do we speak different languages here?

I THINK traditional CPU CORES ARE TOO SLOW compared to modern GPUs and their many-core approach.

I ALSO THINK THEY SHOULD BE FASTER IN GENERAL by a factor of 5 to 40 per CORE to be USEFUL for futuristic stuff. Like AI, Physics and proper.. AI.

Meh!

If you want slower CPUs, good for you. I want faster CPUs.

You made quite a few posts trying to tell us how an apple is a small red orange.
 
 
75. Re: GeForce GTX TITAN Announced Feb 20, 2013, 12:22 Verno
 
The point is that the comparison isn't really apt, which I thought he made pretty clear. A GPU can't do every task a CPU can, and you can't solve every problem with parallel processing. There are trade-offs with everything; it's like demanding ARM chips put out the performance of x86 when there are fundamental architecture differences driven by other considerations like power usage.
Playing: Dragon Age Inquisition, Far Cry 4, This War of Mine
Watching: The Walking Dead, The Fall, As Above So Below
 
74. Re: GeForce GTX TITAN Announced Feb 20, 2013, 12:05 eRe4s3r
 
dj LiTh wrote on Feb 20, 2013, 11:07:
...Your statement is completely false...

Wha... how can you not get what I mean, do we speak different languages here?

I THINK traditional CPU CORES ARE TOO SLOW compared to modern GPUs and their many-core approach.

I ALSO THINK THEY SHOULD BE FASTER IN GENERAL by a factor of 5 to 40 per CORE to be USEFUL for futuristic stuff. Like AI, Physics and proper.. AI.

Meh!

If you want slower CPUs, good for you. I want faster CPUs.
 
 
73. Re: GeForce GTX TITAN Announced Feb 20, 2013, 11:07 dj LiTh
 
wtf_man wrote on Feb 20, 2013, 09:07:
dj LiTh wrote on Feb 20, 2013, 08:26:
Fourth, GPUs can't take on complex calculations, only simple ones, unlike CPUs.

Uhh... not sure where you are getting that one... I'm pretty sure that GPUs are much more efficient at floating point arithmetic than CPUs. How many cores it takes to get that efficiency, I don't know. I do know that they wouldn't be using these cards for supercomputing if they couldn't do complex calculations.

CISC vs RISC
 
 
72. Re: GeForce GTX TITAN Announced Feb 20, 2013, 11:07 dj LiTh
 
eRe4s3r wrote on Feb 20, 2013, 10:09:
dj LiTh wrote on Feb 20, 2013, 08:26:
eRe4s3r wrote on Feb 20, 2013, 06:30:
Chicken-and-egg problem. If CPUs weren't slow pieces of crap compared to GPUs, we would have gameplay evolving, not just stagnating like it does now. Currently RTS games (TOTAL WAR) can't have proper individual unit AI because there is no CPU on the planet that can handle even 10k units with good AI (and their pathfinding/collision detection). And think about why there is so little procedural deformation in physics: if your physics engine affects collision, it needs to be calculated entirely by the CPU.

And don't even get me started on proper neural networks and real-time raytracing. Everything is held back by the CPU and the stupid "let's keep everything on a single core" mandate that next-gen consoles are hopefully going to break.

To give an idea of how insane this performance issue is:

Intel i7 = 109 GFLOPS (or 150 GFLOPS if you push all 6 cores to the limit). Yes, that means 150 / 6 is the actual performance of your CPU in most games, because what grinds games to a halt is a single thread on a single core being CPU-starved.

GTX TITAN = 4500 GFLOPS (according to the PR stuff floating around)

Or put differently, a GTX TITAN is 40 times faster than the fastest CPU your money can buy. Even if there is a 50% variance, GPUs are still ahead of CPUs by nearly a decade.

At this point we should be running 1024-core CPUs.

Wow, just so much wrong with that post. Where to begin... First off, why don't you compare 1 CPU core to 1 GPU core if you think CPUs are so slow? Second, GPU cores are much slower than CPU cores on a per-thread basis. Third, GPUs lack interrupts and virtual memory, which are pretty useful to OSes. Fourth, GPUs can't take on complex calculations, only simple ones, unlike CPUs. Fifth, for tasks that cannot be parallelized you're pretty much SOL on a GPU. Sixth, it's fairly easy to write an x86-compatible program, and very complex to write one that takes advantage of a GPU's parallel processing.

Those are just off the top of my head.

And I never implied otherwise. My point was merely that CPUs are way too friggin' slow ^^ Maybe I should have said that more clearly? But I think CPUs are about 5 to 10 times too slow (per core), and that is why everything stagnates. Sure, an i7 is super fancy, but its single-core performance has reached a ceiling, and this ceiling is what limits games. As you say, some things can't be multi-threaded. Single-core CPU power should be 150 GFLOPS, not 25!

You're comparing 1, 2 or 4 cores vs. hundreds on a GPU. Your statement is completely false; a single CPU core is much faster than a single GPU core.
 
 
71. Re: GeForce GTX TITAN Announced Feb 20, 2013, 10:09 eRe4s3r
 
dj LiTh wrote on Feb 20, 2013, 08:26:
eRe4s3r wrote on Feb 20, 2013, 06:30:
Chicken-and-egg problem. If CPUs weren't slow pieces of crap compared to GPUs, we would have gameplay evolving, not just stagnating like it does now. Currently RTS games (TOTAL WAR) can't have proper individual unit AI because there is no CPU on the planet that can handle even 10k units with good AI (and their pathfinding/collision detection). And think about why there is so little procedural deformation in physics: if your physics engine affects collision, it needs to be calculated entirely by the CPU.

And don't even get me started on proper neural networks and real-time raytracing. Everything is held back by the CPU and the stupid "let's keep everything on a single core" mandate that next-gen consoles are hopefully going to break.

To give an idea of how insane this performance issue is:

Intel i7 = 109 GFLOPS (or 150 GFLOPS if you push all 6 cores to the limit). Yes, that means 150 / 6 is the actual performance of your CPU in most games, because what grinds games to a halt is a single thread on a single core being CPU-starved.

GTX TITAN = 4500 GFLOPS (according to the PR stuff floating around)

Or put differently, a GTX TITAN is 40 times faster than the fastest CPU your money can buy. Even if there is a 50% variance, GPUs are still ahead of CPUs by nearly a decade.

At this point we should be running 1024-core CPUs.

Wow, just so much wrong with that post. Where to begin... First off, why don't you compare 1 CPU core to 1 GPU core if you think CPUs are so slow? Second, GPU cores are much slower than CPU cores on a per-thread basis. Third, GPUs lack interrupts and virtual memory, which are pretty useful to OSes. Fourth, GPUs can't take on complex calculations, only simple ones, unlike CPUs. Fifth, for tasks that cannot be parallelized you're pretty much SOL on a GPU. Sixth, it's fairly easy to write an x86-compatible program, and very complex to write one that takes advantage of a GPU's parallel processing.

Those are just off the top of my head.

And I never implied otherwise. My point was merely that CPUs are way too friggin' slow ^^ Maybe I should have said that more clearly? But I think CPUs are about 5 to 10 times too slow (per core), and that is why everything stagnates. Sure, an i7 is super fancy, but its single-core performance has reached a ceiling, and this ceiling is what limits games. As you say, some things can't be multi-threaded. Single-core CPU power should be 150 GFLOPS, not 25!
 
 
70. Re: GeForce GTX TITAN Announced Feb 20, 2013, 09:07 wtf_man
 
dj LiTh wrote on Feb 20, 2013, 08:26:
Fourth GPU's cant take on complex calculations, only simple ones unlike CPU's.

Uhh... not sure where you are getting that one... I'm pretty sure that GPUs are much more efficient at floating point arithmetic than CPUs. How many cores it takes to get that efficiency, I don't know. I do know that they wouldn't be using these cards for supercomputing if they couldn't do complex calculations.
 
 
69. Re: GeForce GTX TITAN Announced Feb 20, 2013, 08:26 dj LiTh
 
eRe4s3r wrote on Feb 20, 2013, 06:30:
Chicken-and-egg problem. If CPUs weren't slow pieces of crap compared to GPUs, we would have gameplay evolving, not just stagnating like it does now. Currently RTS games (TOTAL WAR) can't have proper individual unit AI because there is no CPU on the planet that can handle even 10k units with good AI (and their pathfinding/collision detection). And think about why there is so little procedural deformation in physics: if your physics engine affects collision, it needs to be calculated entirely by the CPU.

And don't even get me started on proper neural networks and real-time raytracing. Everything is held back by the CPU and the stupid "let's keep everything on a single core" mandate that next-gen consoles are hopefully going to break.

To give an idea of how insane this performance issue is:

Intel i7 = 109 GFLOPS (or 150 GFLOPS if you push all 6 cores to the limit). Yes, that means 150 / 6 is the actual performance of your CPU in most games, because what grinds games to a halt is a single thread on a single core being CPU-starved.

GTX TITAN = 4500 GFLOPS (according to the PR stuff floating around)

Or put differently, a GTX TITAN is 40 times faster than the fastest CPU your money can buy. Even if there is a 50% variance, GPUs are still ahead of CPUs by nearly a decade.

At this point we should be running 1024-core CPUs.

Wow, just so much wrong with that post. Where to begin... First off, why don't you compare 1 CPU core to 1 GPU core if you think CPUs are so slow? Second, GPU cores are much slower than CPU cores on a per-thread basis. Third, GPUs lack interrupts and virtual memory, which are pretty useful to OSes. Fourth, GPUs can't take on complex calculations, only simple ones, unlike CPUs. Fifth, for tasks that cannot be parallelized you're pretty much SOL on a GPU. Sixth, it's fairly easy to write an x86-compatible program, and very complex to write one that takes advantage of a GPU's parallel processing.

Those are just off the top of my head.
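The fifth point, tasks that cannot be parallelized, is the crux of the whole argument, and it has a name: Amdahl's law. A quick sketch (the 5% serial fraction and the 2688-core count are illustrative assumptions, not measured figures):

```python
def amdahl_speedup(serial_fraction, workers):
    """Upper bound on speedup when only (1 - serial_fraction) of a task
    can be spread across `workers` parallel units (Amdahl's law)."""
    return 1.0 / (serial_fraction + (1.0 - serial_fraction) / workers)

# Even with thousands of GPU cores, 5% unavoidably-serial work caps the
# speedup below 20x -- nowhere near "cores times faster":
print(amdahl_speedup(0.05, 2688))  # ~19.9, not 2688x
```

This is why raw GFLOPS comparisons between a GPU and a CPU say little about workloads like game AI that have long serial dependency chains.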
 
 
68. Re: GeForce GTX TITAN Announced Feb 20, 2013, 06:30 eRe4s3r
 
Chicken-and-egg problem. If CPUs weren't slow pieces of crap compared to GPUs, we would have gameplay evolving, not just stagnating like it does now. Currently RTS games (TOTAL WAR) can't have proper individual unit AI because there is no CPU on the planet that can handle even 10k units with good AI (and their pathfinding/collision detection). And think about why there is so little procedural deformation in physics: if your physics engine affects collision, it needs to be calculated entirely by the CPU.

And don't even get me started on proper neural networks and real-time raytracing. Everything is held back by the CPU and the stupid "let's keep everything on a single core" mandate that next-gen consoles are hopefully going to break.

To give an idea of how insane this performance issue is:

Intel i7 = 109 GFLOPS (or 150 GFLOPS if you push all 6 cores to the limit). Yes, that means 150 / 6 is the actual performance of your CPU in most games, because what grinds games to a halt is a single thread on a single core being CPU-starved.

GTX TITAN = 4500 GFLOPS (according to the PR stuff floating around)

Or put differently, a GTX TITAN is 40 times faster than the fastest CPU your money can buy. Even if there is a 50% variance, GPUs are still ahead of CPUs by nearly a decade.

At this point we should be running 1024-core CPUs.

This comment was edited on Feb 20, 2013, 06:40.
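Taking the post's own peak-FLOPS numbers at face value, the ratios work out like this (a sketch only; these are theoretical peaks from marketing material, not sustained game performance):

```python
# Peak theoretical single-precision figures quoted in the post above.
cpu_gflops_all_cores = 150        # six-core i7 with every core saturated
cpu_gflops_one_core = 150 / 6     # ~25 GFLOPS for a game stuck on one thread
gpu_gflops = 4500                 # GTX TITAN peak, per NVIDIA's PR

print(gpu_gflops / cpu_gflops_all_cores)  # 30.0 -- vs. the whole CPU
print(gpu_gflops / cpu_gflops_one_core)   # 180.0 -- vs. a single core
```

The "40 times" figure in the post comes from dividing by the 109 GFLOPS number instead (4500 / 109 ≈ 41).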
 
 
67. Re: GeForce GTX TITAN Announced Feb 20, 2013, 05:47 kyleb
 
Many games don't require a particularly fast CPU to keep up with a 60 Hz or even a 120 Hz refresh rate; they just need a lot of GPU power to do so at very high resolutions with lots of anti-aliasing.
 
66. Re: GeForce GTX TITAN Announced Feb 20, 2013, 05:32 eRe4s3r
 
Even more hilarious, this card is 100% CPU-starved. And as I keep saying, what's holding gaming back (AI) is the CPU, not the GPU.

I want a CPU with 4.5 TFLOPS, and then we can talk.
 
 
65. Re: GeForce GTX TITAN Announced Feb 20, 2013, 02:25 Qbex .
 
"...overkill..." - nonsense, pumpkin. 27" 1440p Korean monitor here; a single GTX 680 is struggling with most demanding games with IQ cranked up. I had to stop using AA to keep frame rates high, and I hate jaggies. It's more pants as my monitor is overclocked to 90 Hz, so hitting 90 fps is even harder. But back on topic: the Titan is a yawner for a gamer. It has GPU-compute grunt more than anything, and a silly price on top; in games it will end up ~30% faster than a GTX 680. I was waiting to get it, but after reading what's what I'd rather get a second 680 and keep the change.

This comment was edited on Feb 20, 2013, 02:45.
 
 
64. Re: GeForce GTX TITAN Announced Feb 19, 2013, 19:52 Redmask
 
Mashiki Amiketo wrote on Feb 19, 2013, 18:31:
dj LiTh wrote on Feb 19, 2013, 15:27:
So you remember people saying SLI has come a long way the year it came out?
Yep, because if you remember, 3dfx had been pumping out tech demos for nearly a year by that time.

Come off it, you obviously misspoke and meant in terms of performance relative to the public.

SLI has come a long way since its inception and the driver problems of the past three years.
 
 
63. Re: GeForce GTX TITAN Announced Feb 19, 2013, 18:31 Mashiki Amiketo
 
dj LiTh wrote on Feb 19, 2013, 15:27:
So you remember people saying SLI has come a long way the year it came out?
Yep, because if you remember, 3dfx had been pumping out tech demos for nearly a year by that time.
 
--
"For every human problem,
there is a neat, simple solution;
and it is always wrong."
--H.L. Mencken
Reply Quote Edit Delete Report
 
62. Re: GeForce GTX TITAN Announced Feb 19, 2013, 18:18 AngelicPenguin
 
Asmo wrote on Feb 19, 2013, 16:49:
AngelicPenguin wrote on Feb 19, 2013, 11:17:
Shataan wrote on Feb 19, 2013, 10:16:
Wtf is the point of having this kinda power if most developers don't push the vis envelope in their games?

Multi-screen gaming.

Not required, I run 6080x1200 (3x 24" 1920x1200) on a single GTX680 4 gig (the extra RAM makes the big difference with the extra displays, a 2 gig has issues with the textures), typically most games can run med-high no sweat. Far Cry 3 on high ran solidly, although I'd say envelope pushing with Crysis 3 might give it a heart attack... =)

Ya, I was just pointing out that in multi-screen gaming you really start to need this sort of power if you want to jack the settings up. I run dual 7970s across three 24" screens, and there are many games I can't play on max settings across all three. Then I have to compromise: either lower the settings or play on the center screen only.

I think these cards are made for people who don't want to compromise and obviously like to spend their extra cash on their gaming hobby.
 
 
61. Re: GeForce GTX TITAN Announced Feb 19, 2013, 18:12 AngelicPenguin
 
Verno wrote on Feb 19, 2013, 15:03:
AngelicPenguin wrote on Feb 19, 2013, 14:51:
While it pains me greatly to say so, I actually have to agree with loony (sic.) A single 7970 at 2560 resolution would not maintain 60fps on the highest settings in any of the examples he listed.

I never claimed it would. I'm just saying you don't necessarily need to run out and buy a $1000+ quasi-commercial card to solve that "problem", but hey, if that's how someone wants to spend their money *shrug*. You can't simply set a framerate target in stone either; different engines are optimized for different targets. The original Crysis is a great example of that. My personal philosophy is to shoot for reasonable averages at what I consider playable at the highest settings.

I understand, but he is obviously (abrasively) coming from a point of view where he wants to play his games at the highest settings at the highest single-monitor resolution you can without sacrificing frame rate. And this scenario will require two $500 cards (or I suppose this card, which I don't know anything about.)

Obviously no one needs any of this stuff, but he is arguing against people who say that with a single monitor this is overkill, which by that criterion it is not.
 
 
60. Re: GeForce GTX TITAN Announced Feb 19, 2013, 18:02 Julio
 
Cutter wrote on Feb 19, 2013, 16:46:
wtf_man wrote on Feb 19, 2013, 15:25:
loomy wrote on Feb 19, 2013, 13:36:
wrong motherfuckers.

Since that is plural, I guess your mom got passed around a lot.

What can I say, Mrs. Cartman was really drunk the night of the barn dance. I'm sure as hell not his dad though!

He can call anyone dad and have a small chance of being right

Given that HOMM3 is the pinnacle of gaming goodness, I don't need this new card. The games that need this level of speed just aren't there.
 
 
59. Re: GeForce GTX TITAN Announced Feb 19, 2013, 17:51 4D-Boxing
 
AngelicPenguin wrote on Feb 19, 2013, 11:17:
Shataan wrote on Feb 19, 2013, 10:16:
Wtf is the point of having this kinda power if most developers don't push the vis envelope in their games?

Multi-screen gaming.

Or even just 2560x1600.
 
 