Sunday Tech Bits
24 Replies. 2 pages. Viewing page 1.

24. Re: NVIDIA’s performance improvement per generation has dropped from 70% (pre-2010) to 30% (post-2010) Sep 12, 2017, 03:07 yuastnav
 
jdreyer wrote on Sep 10, 2017, 22:12:
LittleMe wrote on Sep 10, 2017, 18:15:
Agreeing with CJ on this. Diminishing returns have impacted the whole industry. I'm confident we'll see many more gains in the future but for now this is how it is.

It will take a quantum leap to see an impressive gain, which is an ironic turn of phrase given that the technological leap is likely to come from quantum computing.

And that a quantum leap is actually a very, very small leap indeed.
 
 
23. Re: RE: The Incredible Growth of Python Sep 11, 2017, 15:38 ViRGE
 
jdreyer wrote on Sep 11, 2017, 12:13:
Jivaro wrote on Sep 11, 2017, 11:09:
My inner 15 year old is laughing at the obvious joke here.
Huh?
I'm assuming "the incredible growth of python".
 
 
22. Re: RE: The Incredible Growth of Python Sep 11, 2017, 12:13 jdreyer
 
Jivaro wrote on Sep 11, 2017, 11:09:
My inner 15 year old is laughing at the obvious joke here.
Huh?
 
 
21. RE: The Incredible Growth of Python Sep 11, 2017, 11:09 Jivaro
 
My inner 15 year old is laughing at the obvious joke here.  
 
20. Re: Sunday Tech Bits Sep 11, 2017, 09:53 Shineyguy
 
So, from 980Ti to 1080Ti isn't the 50% to 60% improvement I thought it was? I mean, that is definitely within a generation. 780Ti to 980Ti looks a little smaller, maybe 30% to 50%. That's still quite a jump.


But expecting to see 70% generation to generation, especially when core counts are around the same, is pretty insane.
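For reference, the gen-over-gen number is just new ÷ old − 1. A minimal sketch of that arithmetic, using illustrative relative-performance placeholders rather than measured benchmark results:

```python
# Gen-over-gen improvement: (new / old) - 1, printed as a percentage.
# The relative scores are illustrative placeholders, not benchmarks.
scores = {"780 Ti": 100, "980 Ti": 140, "1080 Ti": 220}

cards = list(scores)
for old, new in zip(cards, cards[1:]):
    gain = scores[new] / scores[old] - 1
    print(f"{old} -> {new}: {gain:+.0%}")
# 780 Ti -> 980 Ti: +40%
# 980 Ti -> 1080 Ti: +57%
```

With those placeholder numbers, both steps land in the ranges estimated above.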
 
 
19. Re: NVIDIA’s performance improvement per generation has dropped from 70% (pre-2010) to 30% (post-2010) Sep 11, 2017, 06:42 ViRGE
 
LittleMe wrote on Sep 10, 2017, 18:15:
Agreeing with CJ on this. Diminishing returns have impacted the whole industry. I'm confident we'll see many more gains in the future but for now this is how it is.
Agreed with the above.

But also, this article takes an unusual approach to defining generations. To the point where I'm not sure if it's naive or clickbait.

Long story short, they switch the definition of a generation midway through. E.g. they go from the 7800 GTX to the 8800 GTX, skipping the 7900. But then they go from the GTX 480 to the 580 to the 680.

GTX 480 and GTX 580 use the same Fermi GPU, with the latter getting a later revision of the design with some errata fixes and other modifications to reduce power leakage. While NVIDIA incremented the version number, it wasn't pitched as a whole new generation any more than the 7900 GTX was. Similarly, GTX 780 is a bigger version of the Kepler GPU used in GTX 680, so it's architecturally the same generation as well (released as a mid-gen refresh, once 28nm yields improved).

If you actually compare architecture generation-to-generation, you'd get something like 580-680-980-1080. Which would still show a decrease in gains as chip complexity goes up and manufacturing gains don't come as frequently. But nothing quite so dramatic. Alternatively, if you did pure year-to-year, you'd indeed see a greater drop-off. But then you'd also have to explain why there's a 0% for 2015.

It's the kind of article that's not outright wrong. But it lacks the clarity and context to explain what's happening and why. "The reason behind this decrease is quite obvious. For starters, AMD cannot keep up with NVIDIA. Due to the lack of competition, NVIDIA does not feel the pressure to release a graphics card that will be way, way better than its previous generation offerings" is very much wrong. It makes the assumption that NVIDIA could go at a significantly faster pace without taking the time to understand why that isn't practical.
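To make the grouping point concrete, here is a minimal sketch of how redefining "generation" changes the measured per-step gain; the relative-performance numbers are illustrative placeholders, not benchmark data:

```python
# The per-step gain depends on which cards you call a "generation".
# Relative-performance numbers are illustrative placeholders, not benchmarks.
perf = {"GTX 480": 100, "GTX 580": 115, "GTX 680": 160,
        "GTX 980": 230, "GTX 1080": 390}

def gains(seq):
    """Yield (old, new, fractional gain) for consecutive cards in seq."""
    for old, new in zip(seq, seq[1:]):
        yield old, new, perf[new] / perf[old] - 1

by_model = ["GTX 480", "GTX 580", "GTX 680", "GTX 980", "GTX 1080"]
by_arch = ["GTX 580", "GTX 680", "GTX 980", "GTX 1080"]  # Fermi, Kepler, Maxwell, Pascal

for label, seq in [("by model number", by_model), ("by architecture", by_arch)]:
    print(label)
    for old, new, g in gains(seq):
        print(f"  {old} -> {new}: {g:+.0%}")
```

Counting the 480-to-580 step as a full generation drags the average gain down, which is exactly the effect described above.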
 
 
18. Re: Morning Mobilization Sep 11, 2017, 05:13 Tom
 
Comparing raw performance from generation to generation is of little value. What matters much more is efficiency: performance per watt. It's no longer practical to keep cranking up the TDP and relying on ever beefier cooling solutions.

If you compare performance per watt from generation to generation, NVIDIA has been making huge gains.
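A minimal sketch of that comparison. The TDP figures are the published board powers; the relative-performance numbers are illustrative placeholders, not measured results:

```python
# Efficiency = relative performance / TDP (performance per watt).
# TDPs are published board powers; perf numbers are illustrative
# placeholders, not measured results.
cards = {
    "GTX 580":  {"perf": 100, "tdp_w": 244},
    "GTX 680":  {"perf": 160, "tdp_w": 195},
    "GTX 980":  {"perf": 230, "tdp_w": 165},
    "GTX 1080": {"perf": 390, "tdp_w": 180},
}

prev = None
for name, c in cards.items():
    eff = c["perf"] / c["tdp_w"]
    note = f" ({eff / prev - 1:+.0%} vs previous)" if prev else ""
    print(f"{name}: {eff:.2f} perf/W{note}")
    prev = eff
```

Even with flat or falling TDPs, each step shows a large efficiency jump, which is the gain that raw-performance comparisons hide.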
 
 
17. Re: NVIDIA’s performance improvement per generation has dropped from 70% (pre-2010) to 30% (post-2010) Sep 11, 2017, 03:37 Sempai
 
CJ_Parker wrote on Sep 10, 2017, 15:37:
All the big foundries (Intel, GloFo, TSMC etc.) are struggling with real physical limitations on the path to smaller and smaller nodes. It is complete grade A bullshit when some fantards of either camp are claiming that it (the slow progress) all has to do with complacency.

How would that even work? Do Intel engineers get a couple hours off per work day to wank off?

What a silly bunch of crap. They are always moving full steam ahead, testing, iterating, testing again... CPU/GPU development is incredibly complex.
If you look at the financial reports of the big foundries, you can see that their R&D expenses have exploded in recent years. They are trying hard to push on but it's not as easy as ten years ago when "all you had to do" was buy some new equipment to make your fabs ready for a smaller manufacturing process.

There is zero room for complacency. Intel has pretty much lost the whole mobile market to giants like Qualcomm and Samsung. There is no time for sleep.
nVidia is pushing full steam ahead because they are already seeing huge new avenues in automotive and the supercomputing sector which might spur their growth big time. But they rely on TSMC and... see start of this post... they are struggling just like everyone else due to real technical hurdles that take time to overcome.

Specifically for Intel you can find a pretty good and not overly long article here at Anandtech that explains why it is taking them so long to move to 10nm+. Newsflash: It's not because their engineers are scratching their balls, playing with their dicks or picking their noses all day. Imagine that.

Agreed, the struggle is real. But that certainly hasn't stopped Intel's endless anti-consumer fuckery year in and year out while maintaining their monopoly over the industry.
 
 
16. Re: Sunday Tech Bits Sep 10, 2017, 22:14 jdreyer
 
Fantaz wrote on Sep 10, 2017, 21:57:
Ozmodan wrote on Sep 10, 2017, 16:23:
Anyone waiting for Nvidia's Volta is, I think, going to be disappointed. I have seen multiple articles saying not to expect more than a 10% improvement in the next generation. So spending money on a 1080 Ti is not going to be a bad thing, as its replacement probably won't be that much better.

I heard from insiders that it's actually going to be a much bigger improvement than ever before.

They always say that, though, and it rarely is. I hope you're right.
 
 
15. Re: NVIDIA’s performance improvement per generation has dropped from 70% (pre-2010) to 30% (post-2010) Sep 10, 2017, 22:12 jdreyer
 
LittleMe wrote on Sep 10, 2017, 18:15:
Agreeing with CJ on this. Diminishing returns have impacted the whole industry. I'm confident we'll see many more gains in the future but for now this is how it is.

It will take a quantum leap to see an impressive gain, which is an ironic turn of phrase given that the technological leap is likely to come from quantum computing.
 
 
14. Re: Sunday Tech Bits Sep 10, 2017, 21:57 Fantaz
 
Ozmodan wrote on Sep 10, 2017, 16:23:
Anyone waiting for Nvidia's Volta is, I think, going to be disappointed. I have seen multiple articles saying not to expect more than a 10% improvement in the next generation. So spending money on a 1080 Ti is not going to be a bad thing, as its replacement probably won't be that much better.

I heard from insiders that it's actually going to be a much bigger improvement than ever before.
 
 
13. Re: Sunday Tech Bits Sep 10, 2017, 20:57 HorrorScope
 
Sure it's slowing down overall, but this last gen was a big jump.  
 
12. Re: NVIDIA’s performance improvement per generation has dropped from 70% (pre-2010) to 30% (post-2010) Sep 10, 2017, 18:15 LittleMe
 
Agreeing with CJ on this. Diminishing returns have impacted the whole industry. I'm confident we'll see many more gains in the future but for now this is how it is.  
 
11. Re: Sunday Tech Bits Sep 10, 2017, 17:13 jacobvandy
 
Nintendo Switch is powered by NVIDIA, and that thing's printing money. The revenue it's provided them immediately surpassed every other application of Tegra.  
 
10. Re: NVIDIA’s performance improvement per generation has dropped from 70% (pre-2010) to 30% (post-2010) Sep 10, 2017, 17:00 Redmask
 
jdreyer wrote on Sep 10, 2017, 14:54:
The Half Elf wrote on Sep 10, 2017, 14:35:
I'd like to see Microsoft/Sony/Nintendo quit using ATI and switch to Nvidia.

So, you want an Nvidia monopoly?

Nvidia is already just competing with themselves.
 
 
9. Re: Sunday Tech Bits Sep 10, 2017, 16:35 jdreyer
 
Shorter CJ Parker: Diminishing returns.  
 
8. Re: Sunday Tech Bits Sep 10, 2017, 16:23 Ozmodan
 
Anyone waiting for Nvidia's Volta is, I think, going to be disappointed. I have seen multiple articles saying not to expect more than a 10% improvement in the next generation. So spending money on a 1080 Ti is not going to be a bad thing, as its replacement probably won't be that much better.
 
7. Re: NVIDIA’s performance improvement per generation has dropped from 70% (pre-2010) to 30% (post-2010) Sep 10, 2017, 15:37 CJ_Parker
 
All the big foundries (Intel, GloFo, TSMC etc.) are struggling with real physical limitations on the path to smaller and smaller nodes. It is complete grade A bullshit when some fantards of either camp are claiming that it (the slow progress) all has to do with complacency.

How would that even work? Do Intel engineers get a couple hours off per work day to wank off?

What a silly bunch of crap. They are always moving full steam ahead, testing, iterating, testing again... CPU/GPU development is incredibly complex.
If you look at the financial reports of the big foundries, you can see that their R&D expenses have exploded in recent years. They are trying hard to push on but it's not as easy as ten years ago when "all you had to do" was buy some new equipment to make your fabs ready for a smaller manufacturing process.

There is zero room for complacency. Intel has pretty much lost the whole mobile market to giants like Qualcomm and Samsung. There is no time for sleep.
nVidia is pushing full steam ahead because they are already seeing huge new avenues in automotive and the supercomputing sector which might spur their growth big time. But they rely on TSMC and... see start of this post... they are struggling just like everyone else due to real technical hurdles that take time to overcome.

Specifically for Intel you can find a pretty good and not overly long article here at Anandtech that explains why it is taking them so long to move to 10nm+. Newsflash: It's not because their engineers are scratching their balls, playing with their dicks or picking their noses all day. Imagine that.
 
 
6. Re: NVIDIA’s performance improvement per generation has dropped from 70% (pre-2010) to 30% (post-2010) Sep 10, 2017, 15:25 RedEye9
 
jdreyer wrote on Sep 10, 2017, 14:54:
The Half Elf wrote on Sep 10, 2017, 14:35:
I'd like to see Microsoft/Sony/Nintendo quit using ATI and switch to Nvidia.

So, you want an Nvidia monopoly?
Exactly, anyone w/half a brain would know that having Microsoft/Sony/Nintendo use ATI is a win for all.
 
 
5. Re: NVIDIA’s performance improvement per generation has dropped from 70% (pre-2010) to 30% (post-2010) Sep 10, 2017, 15:20 Creston
 
Wallshadows wrote on Sep 10, 2017, 14:32:
Nvidia holds such a large share of the market they can pretty much afford to stop trying so hard. Even with the RX series success it's a drop out of their bucket of money. Hell, they're so confident in how bad VEGA is that they're no longer concerned about the release date of Volta.

It's what happens when you're basically racing against yourself. At least Ryzen is forcing Intel's hand at un-shittening their scummy practices from the past six years.

I'm sure that's part of it, but isn't it also just basically Moore's Law? A 70% gain per generation is a humongous jump; would that really be sustainable indefinitely? (Note: Not a hardware engineer at all.)
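For scale: a fixed gain of r per generation compounds to (1 + r)^n after n generations, so the two rates diverge enormously. Pure arithmetic, no hardware data involved:

```python
# A fixed gain of r per generation compounds to (1 + r)**n after n generations.
for r in (0.70, 0.30):
    print(f"{r:.0%}/gen over 10 generations -> {(1 + r) ** 10:,.0f}x total")
# 70%/gen over 10 generations -> 202x total
# 30%/gen over 10 generations -> 14x total
```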
 
 