Saturday Tech Bits

27 Replies. 2 pages. Viewing page 1.

27. Re: Saturday Tech Bits Aug 19, 2018, 08:37 El Pit
 
Let's wait for benchmarks from trusted websites. Then I will decide if I upgrade from my GTX1070. I am on WQHD, so I am not really hurting for more fps right now. But if Turing can deliver and is not overpriced, I might bite.  
They're waiting for you, Gabe, in the test chamber!
 
26. Re: Saturday Tech Bits Aug 19, 2018, 01:23 Slick
 
Also, fuck Shadowplay, which hasn't worked for about a year for me. I've reinstalled everything but Win10 itself. Constantly broken. I assume AMD has a similar feature that might actually work.
 
For your transgressions you shall be labeled a shill, called an idiot and anytime you mention facts or disagree with a tribe member you will henceforth be known as a troll. The best you can hope for is that the labels won't haunt your offspring. -RedEye9
 
25. Re: Saturday Tech Bits Aug 19, 2018, 01:15 Slick
 
Those huge leaps in VR never amounted to anything; all of their multi-projection tech (which was actually cool), to my knowledge, never came out the way it was advertised. It's been 2 years and bupkis. I'm expecting about the same for RT. Waste of silicon; I'd rather get more base game performance.

This has been the same old story ever since Gameworks came about: they back closed-source tech, and devs go: "hmmm, I could develop for it, or develop for AMD and all consoles instead"

AMD really needs to reach GPU parity soon, because I can't wait to never buy nVidia again. Fuck their closed-source tech that no one ever uses.
 
 
24. Re: Saturday Tech Bits Aug 18, 2018, 23:22 CJ_Parker
 
Slick wrote on Aug 18, 2018, 22:43:
So you honestly think slower clocks and 15% more CUDA cores will make a "generational leap" when it comes to the new architecture?

See, you are once again jumping to conclusions. ;) The clocks are slower only on paper; let's wait and see the real-world clocks.

Maybe, if RT is not used, the GPU will actually come close to the leaked 2.5GHz because the RT cores are disabled while the rest of the chip switches into overdrive. That's a pretty realistic possibility.

Also, let's look at Pascal again: My GTX 1080Ti has an advertised boost clock of 1670MHz. It never ever clocks that low. It always hits 1910+, no matter what I throw at it.

It's useless to speculate from "on paper" clocks how oh-so disappointing Turing will be. Yeah, it could be disappointing or it could also have some aces up its sleeve like smart boosting depending on whether RTX cores are used or not.

We'll see in due time... the TDP of 285W is actually an indication that the real world boost clocks will probably WAY exceed the on paper clocks via a high power target.

Otherwise, from the Anandtech article I linked previously...

As for the CUDA cores, NVIDIA is saying that the Turing GPU can offer 16 TFLOPS of performance. This is slightly ahead of the 15 TFLOPS single precision performance of the Tesla V100, or even a bit farther ahead of the 13.8 TFLOPS of the Titan V. Or if you're looking for a more consumer-focused reference, it's about 32% more than the Titan Xp. Some quick paper napkin math with these figures would put the GPU clockspeed at around 1730MHz, assuming there have been no other changes at the SM level which would throw off the traditional ALU throughput formulas.

[ ... ]

Otherwise with the architectural shift, it's difficult to make too many useful performance comparisons, especially against Pascal. From what we've seen with Volta, NVIDIA's overall efficiency has gone up, especially in well-crafted compute workloads. So the roughly 33% improvement in on-paper compute throughput versus the Quadro P6000 may very well be underestimating things. As for consumer product speculation, I'll hold off on that entirely.

^ The Turing RTX 8000 has 4608 cores, and simply going by core throughput at 1730MHz (ignoring all other architectural optimizations) it should be ~30% faster than the Titan Xp. The 2080Ti will have 4352 cores and will easily boost above 1730MHz.
So, fewer cores but maybe some architectural benefits... it is not unrealistic that the 2080Ti will outperform the 1080Ti by a good 30%.
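The quoted napkin math can be checked directly: single-precision throughput for these GPUs is roughly 2 FLOPs (one fused multiply-add) per CUDA core per clock. A quick sketch, taking the Titan Xp's 3840 cores and ~1582MHz paper boost clock as assumed reference figures:

```python
# Single-precision throughput model: 2 FLOPs (one FMA) per CUDA core per clock.
def tflops(cores, clock_mhz):
    return cores * 2 * clock_mhz * 1e6 / 1e12

# Invert the formula to recover the implied clock from a TFLOPS claim.
def clock_for_tflops(cores, target_tflops):
    return target_tflops * 1e12 / (cores * 2) / 1e6  # MHz

# Quadro RTX 8000 (Turing): 4608 cores at NVIDIA's claimed 16 TFLOPS
print(clock_for_tflops(4608, 16.0))  # ~1736 MHz, matching Anandtech's ~1730MHz estimate

# Relative throughput vs. Titan Xp (3840 cores, ~1582MHz boost on paper)
print(tflops(4608, 1736) / tflops(3840, 1582))  # ~1.32, i.e. ~32% more
```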

I'm not sure what you were expecting. A +500% performance increase from Pascal?
Come on. It was to be expected that this small shrink to 12nm (which is actually 16nm+) plus the new architecture would yield a typical +25% to +30% max performance increase between comparable cards of the previous gen.

We'll see a huge leap in RT obviously since there is dedicated hardware for it now but otherwise we are looking at a normal generational leap, yes.
I'm sure nVidia will put extra marketing effort into touting the RT differences as they did with Pascal in VR. Due to the new SMP projection technology that came with GP-104 they emphasized huge performance leaps in VR. Same with RT this time around.

I can't say I care too much about RT either. Modern game engines have become really good at faking light sources and reflections so I doubt that there will be a visually stunning difference at first. In the heat of the action you would probably not even notice when blind-playing a sample segment.

I'm sure it has potential to become awesome eventually when games are fully converted to make use of RT but that will probably take at least until the RTX 4080 or 5080 series for playable results (4K@60fps).
 
 
23. Re: Saturday Tech Bits Aug 18, 2018, 22:46 Slick
 
Also, I was clearly talking about the specs on paper re: Pascal. The RTX 2080 is EFFECTIVELY a rebrand because it looks like it won't have more than 10% more FPS in games than a 1080 Ti. It's a crossgrade more than anything, they are expecting people who already own top-tier cards to want to plonk down another wad of cash to get hardware ray-tracing.

So on Monday, if that raytracing shit doesn't blow everyone out of the water, then what is there to get excited about? The RTX 2080, 2 years in the making, is going to be barely an evolutionary bump in FPS for 99% of gamers.

Unless enabling RTX somehow INCREASES your total FPS by diverting computational effort from rasterization, shadows, lighting etc. to a dedicated piece of hardware. And that would be something worth getting excited about.
 
 
22. Re: Saturday Tech Bits Aug 18, 2018, 22:43 Slick
 
So you honestly think slower clocks and 15% more CUDA cores will make a "generational leap" when it comes to the new architecture?

That's being rather optimistic.

There's a small die shrink, which has resulted in slower clocks... and a massive 285W TDP, so that's basically a wash.

It was obviously to make room for the RTX hardware on-die. But I don't think that's going to be worth it, because enabling it will still probably slow the overall framerate, if any of the other Gameworks techs over the past 10 years are any indication.

GDDR6 and 12nm are only good if they go toward getting me more frames in-game. If that's just to provide additional bandwidth and energy efficiency so they can spend a quarter of the die on this RTX ray-tracing thing, then it seems like a wasted attempt.

Feels just like Intel wasting half their die area on a stupid integrated GPU that I'll never use. The RTX cores will be more of the same, unless they can somehow make the lighting/shadows look better WITHOUT sacrificing framerate, which is yet to be seen.

Still, I want the frames to power either a 144hz 4k monitor, or a 240hz 1440p monitor. I have no interest in being an early adopter for another gameworks bullshit marketing stunt that 3 developers will ever use.
 
 
21. Re: Saturday Tech Bits Aug 18, 2018, 21:59 CJ_Parker
 
Slick wrote on Aug 18, 2018, 21:20:
Also, I doubt that any benchmarks are going to be using ray tracing when comparing against other cards. Seeing as they're the only one with dedicated hardware, and they're using a proprietary API, then there's really no comparison, let alone any games...

Well, 3D Mark with DirectX ray-tracing support will be out later this year so we'll have a proper comparison tool then.
I think it is rather pointless with regard to old cards though. They will all be much too slow to properly run RT and even the RTX series will struggle with playable fps.

And from the looks of it, slower GPU clocks and a handful more CUDA cores on paper mean that it doesn't look like the RTX 2080 is going to be worth anyone's time, as it'll just be a rebranded 1080 Ti with some extra shit that will "never" be used.

The RTX 2080 a rebrand? Sorry but that is very uneducated talk. You may want to do some reading on the architectural changes that come with Turing like this excellent article.

It's an all new architecture. No one knows right now how it will respond to clocks or how much performance boost is coming purely from the architectural optimizations. That analysis will be very interesting once the cards are literally on the table.

You are jumping to way too premature conclusions and simply assuming that it is mostly a Pascal rebrand and that it will behave similarly with regard to MHz and boost clocks etc.

We don't know that yet. It's certainly an evolutionary and not a totally revolutionary step but still... let's wait and see what the independent reviews turn up.

I say this as someone who's been hyped for this, and equally disappointed.

Well, I'm really looking forward to the RTX series and to upgrading this fall (i9-9900K + 2080 or 2080Ti) but I have always had realistic expectations.

Turing is based on an optimized 16nm process that TSMC marketing is calling 12nm. It is a far cry from the jump from 28nm (Maxwell) to 16nm (Pascal) that we've seen before.

The architectural changes are evolutionary and we now get GDDR6 instead of GDDR5(X).

That is pretty much all. There was no real reason to expect the Second Coming so I never did.

I'm also quite sure that nVidia did not really want to wait over two years. They probably wanted to release these cards by May at the very latest, but mining fucked up everything real good, which is why they are now forced to expedite the release of the new series with the RTX 2080Ti being available from the start.
 
 
20. Re: Saturday Tech Bits Aug 18, 2018, 21:20 Slick
 
Pascal was said to OC to 2.1GHz on air. My GTX 1080 OCs to 2.05GHz on air. That's comparable.

Saying the RTX 2080 Ti will OC to 2.5GHz when we're looking at MAYBE hitting 1.9GHz with voltage is much different. We were told we'd be getting higher clocks, and it looks like they'll actually be lower than last gen. That's a kick in the nuts.

They're just leaks, but I'm saying that pre-launch expectations for Turing are very different from what we're actually getting.

Also, I doubt that any benchmarks are going to be using ray tracing when comparing against other cards. Seeing as they're the only one with dedicated hardware, and they're using a proprietary API, then there's really no comparison, let alone any games...

So we're talking benchmarks with the dedicated RT hardware not functioning; it's the only apples-to-apples we have. And from the looks of it, slower GPU clocks and a handful more CUDA cores on paper mean that it doesn't look like the RTX 2080 is going to be worth anyone's time, as it'll just be a rebranded 1080 Ti with some extra shit that will "never" be used.

I say this as someone who's been hyped for this, and equally disappointed.

Let's wait to see the real benchmarks from 3rd party sites before we claim anything for sure, but it doesn't look good on paper.
 
 
19. Re: Saturday Tech Bits Aug 18, 2018, 21:15 CJ_Parker
 
Slick wrote on Aug 18, 2018, 20:59:
Also, leaks said the new cards would hit 2.5GHz. Seeing that it's 1.5GHz out of the box is a pretty big difference.

Well, then either the leaks were fake bullshit or maybe we will see those high clocks in spite of the relatively low "on paper" boost clocks.

Or, as I said, the cards will boost to 2.xGHz as long as no RT is involved and the lower "on paper" clocks are only applicable for "worst case" scenarios, i.e. when RT is at work.

Personally, I think the leaks were just bullshit. It was similar with Pascal. There were a lot of rumors that it would "easily" OC to 2.1/2.2GHz on air but that never really happened except for short bursts in some synthetic benchmarks.

For most Pascal 1070 - 1080Ti cards, to keep them 100% stable even in demanding games, the ceiling was around the 2GHz mark. Reaching that last 100MHz was pointless anyway.
Most people probably just wanted to break through that "psychological" 2000MHz barrier but whether the card runs just under 2000MHz or at 2.1GHz... who cares? That's a 5% difference in pure MHz which translates to a negligible performance difference of maybe 2% or 3% max.
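As a sanity check on that 5%-more-MHz-gets-you-2-3%-more-performance point: if only part of the frame time scales with core clock (the rest being memory bandwidth, CPU, engine overhead), the arithmetic works out as sketched below. The ~50% clock sensitivity is purely an assumed illustration figure, not a measured one:

```python
# Hypothetical model: only a fraction of performance scales with core clock;
# the rest (memory bandwidth, CPU, engine overhead) does not.
def fps_gain(clock_gain, clock_sensitivity=0.5):
    return clock_gain * clock_sensitivity

clock_gain = 2100 / 2000 - 1           # 5% more MHz (2.1GHz vs just under 2.0GHz)
print(f"{fps_gain(clock_gain):.1%}")   # ~2.5% real-world gain at 50% sensitivity
```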
 
 
18. Re: Saturday Tech Bits Aug 18, 2018, 20:59 Slick
 
RTX is the proprietary nVidia tech. That's different from the DX12 API.

Also, leaks said the new cards would hit 2.5GHz. Seeing that it's 1.5GHz out of the box is a pretty big difference.
 
 
17. Re: Saturday Tech Bits Aug 18, 2018, 20:46 CJ_Parker
 
Mordecai Walfish wrote on Aug 18, 2018, 17:03:
How many games are going to adopt RTX without having nVidia's GeForce Game Ready fist up their ass? I'm guessing not many, with the next console generation likely going with AMD hardware again. It's just an unnecessary development cost when the "main" gaming platforms aren't going to support it, and AMD doesn't really have any incentive to shoehorn in compatibility for nVidia's standards. Will be interesting to see where this goes.

Ray-tracing is a part of DirectX 12, dude. It's not a proprietary nVidia thing or anything like that. Any garage dev can address RT via the DirectX API if they feel like it.

Since we know that Microsoft does not give half a flying fuck about PC gaming, it is not too hard to guess why they made it a part of DirectX.

It's for the next Xbox generation, of course. Next gen consoles, as well as next gen AMD GPUs, will definitely support hardware RT via DirectX 12.

Until then, yes, we will only see ray-tracing supported in case nVidia has bribed devs to do so but RT will most likely become a standard from 2020 onward when the new consoles are here.

Maybe nVidia does not even need to bribe all that hard. For the big studios it probably makes sense to begin training their staff now and let them gain some RT DirectX experience ASAP so they'll be ready to produce good quality RT results as soon as the next gen of consoles is out.
 
 
16. Re: Saturday Tech Bits Aug 18, 2018, 20:35 CJ_Parker
 
Slick wrote on Aug 18, 2018, 14:58:
Also, the advertised clock speeds look pretty low compared with the supposed leaks in the previous weeks. They were saying 2.1GHz, with voltage boost clocks up to 2.5GHz. Now it's 1.3 boosting to 1.5? With voltage what? 1.7? 1.8? I don't know if the additional CUDA count will offset this to make it worthwhile.

Umm... when was the last time that nVidia-based cards were able to reach only the exact advertised boost clock? Right. Never.

I have personally owned three Gainward Golden Sample variants of the Pascal series. A GTX 1070, 1080 and now a 1080Ti. They have advertised boost clocks of 1835MHz for the GTX 1070, 1847MHz for the GTX 1080 and 1670MHz for the GTX 1080Ti.

Every single one of these cards typically ran (still runs in the case of the 1080Ti) at around the 1950MHz mark in gaming. They sometimes hit the 1970s, and the lowest I have seen in extremely demanding scenarios was around 1910MHz, but the typical core clock I'm seeing most of the time is 1949MHz.
None of the cards ever dropped below 1900MHz or came even close to the "low" advertised boost clocks.

I'm sure it will be the same with the RTX series. They will clock much higher than advertised.
Besides, the Titan-V is a beast in spite of relatively low MHz. Turing is an all new architecture. It could outperform Pascal at lower clocks.

It is also possible that nVidia has implemented "smart boost" capabilities where the cards will boost north of 2000MHz if no ray-tracing cores are used and that the relatively low clocks will only apply when ray-tracing is involved.

I guess we'll find out more on Monday, though independent reviews might come out at a later date since it seems like nVidia have not shipped review samples yet.
 
 
15. Re: Saturday Tech Bits Aug 18, 2018, 18:05 Slick
 
Ozmodan wrote on Aug 18, 2018, 18:00:
Lord Tea wrote on Aug 18, 2018, 15:14:
jdreyer wrote on Aug 18, 2018, 15:04:
Are there any games that take advantage of real time ray tracing?

Yes. Battlefield 5, Metro Exodus... just to name a couple.

Why would anyone use ray tracing when even with a top end card it slows down your computer significantly?

It's not hard to imagine RT being used in indie or smaller games where the base rendering isn't already as taxing as a AAA game. Something like Unravel: it's simple, looks great, isn't too GPU-demanding... that shit could benefit if it can run at 30fps and still play well.

But yeah, touting 12 FPS in their promo video as "6x faster than before" is still worthless for any big game.
 
 
14. Re: Saturday Tech Bits Aug 18, 2018, 18:00 Ozmodan
 
Lord Tea wrote on Aug 18, 2018, 15:14:
jdreyer wrote on Aug 18, 2018, 15:04:
Are there any games that take advantage of real time ray tracing?

Yes. Battlefield 5, Metro Exodus... just to name a couple.

Why would anyone use ray tracing when even with a top end card it slows down your computer significantly?
 
 
13. Re: Saturday Tech Bits Aug 18, 2018, 17:17 Slick
 
Mordecai Walfish wrote on Aug 18, 2018, 17:03:
How many games are going to adopt RTX without having nVidia's GeForce Game Ready fist up their ass? I'm guessing not many, with the next console generation likely going with AMD hardware again. It's just an unnecessary development cost when the "main" gaming platforms aren't going to support it, and AMD doesn't really have any incentive to shoehorn in compatibility for nVidia's standards. Will be interesting to see where this goes.

The answer will undoubtedly be zero.

They've already shown their utter distaste for open standards by avoiding HDMI 2.1 because it competes slightly with their BFG G-Sync TVs...

RTX will be gameworks only, I guarantee it.

I should specify that MS has a new ray-tracing API "plug-in" for DX12. This will be the open-source tech that AMD will use, and it will probably see mass adoption in the next round of consoles... Depending on how efficient they get it, at least it will be another big marketing push.

This comment was edited on Aug 18, 2018, 18:03.
 
 
12. Re: Saturday Tech Bits Aug 18, 2018, 17:03 Mordecai Walfish
 
How many games are going to adopt RTX without having nVidia's GeForce Game Ready fist up their ass? I'm guessing not many, with the next console generation likely going with AMD hardware again. It's just an unnecessary development cost when the "main" gaming platforms aren't going to support it, and AMD doesn't really have any incentive to shoehorn in compatibility for nVidia's standards. Will be interesting to see where this goes.
 
11. Re: Saturday Tech Bits Aug 18, 2018, 16:43 jdreyer
 
Slick wrote on Aug 18, 2018, 15:13:
jdreyer wrote on Aug 18, 2018, 15:04:
Are there any games that take advantage of real time ray tracing?

Short answer: we'll know on Monday if any do.

There's a new DirectX ray-tracing API thingy coming out, which AMD have said they will support. But TBH there are no games that even really support DX12 yet... it's gonna be hard to anticipate any big games supporting RT in the near future.

That being said, you'd think it would be suicide to have this big hardware announcement without attaching at least one AAA studio to reveal they will support RT in a game this holiday season. I mean, they already do that shit with their Gameworks program on PC with the big studios.
Good analysis, thanks.
 
 
The land in Minecraft is flat, Minecraft simulates the Earth, ergo the Earth is flat.
 
10. Re: Saturday Tech Bits Aug 18, 2018, 16:40 Slick
 
Nice, thanks for the link.

I haven't seen any options in the closed alpha, but I just found out that Metro also uses Frostbite, so it's not much of a stretch for BFV to support it in some capacity.

That would be a AAA title worthy of a hardware launch.

We'll see if enabling it is worth anyone's time though... as with many other Gameworks techs, it's always been a huge tradeoff in FPS. Like, has ANYONE ever used the percentage-closer soft shadows (PCSS) nVidia supports in any game? It like halves my framerate...
 
 
9. Re: Saturday Tech Bits Aug 18, 2018, 16:29 Lord Tea
 
Slick wrote on Aug 18, 2018, 15:27:
Lord Tea wrote on Aug 18, 2018, 15:14:
jdreyer wrote on Aug 18, 2018, 15:04:
Are there any games that take advantage of real time ray tracing?

Yes. Battlefield 5, Metro Exodus... just to name a couple.

Wait, Battlefield 5 is confirmed? I've read about Metro Exodus, but there's no "real" proof other than that it will eventually be supported; to my eyes nothing at launch, which is still a ways off, so it just seems like bullshit to me until I see something more official from the devs.

I caught this metro trailer last night saying that it'll have ray tracing, I didn't realise that this trailer was supposedly SHOWING OFF the raytracing.

Just like DX12, if the game isn't built from the ground up to support it, this tacked-on shit doesn't impress me at all. I honestly didn't see anything impressive about that Metro trailer; it just looks like a current-gen console game with a slightly nicer PC treatment.

The devs are experimenting with RT in early builds. So, there is a pretty high chance RT will make it into the final game. I only have a link to the German PCGames Hardware site. In their screenshot RT is off but they're using a 1080 Ti.
 
Anytime. Anywhere. Anyone.
[Tagline: John Carpenter's "The Thing"]
[Tagline: United Police States of America]
 
8. Re: Saturday Tech Bits Aug 18, 2018, 15:55 Slick
 
Sorry for the repeat posts, I've been looking forward to the next gen of GPUs for a while, as I'm sure we all have...

Check out this video at this time; this seems to be where nVidia is getting their "6x performance" metric from, which is entirely just the ms it took to render a frame with ray tracing before versus how long it takes now.

What used to be 308ms is now 45ms. What used to be 576ms is now 86ms.

A big improvement for sure, but this isn't going to do shit for modern games! What gamer is going to put up with 86ms between frames, down from 576ms or not?!

That's literally going from 2 FPS to 12 FPS. Woo hoo? Who the fuck cares! And that was for a damn simple scene with just one car onscreen.
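The conversion behind those numbers is just fps = 1000 / frame time in milliseconds; a quick sketch:

```python
# Frame time in milliseconds to frames per second.
def ms_to_fps(frame_ms):
    return 1000.0 / frame_ms

# The two before/after frame-time pairs from the promo video.
for before, after in [(308, 45), (576, 86)]:
    print(f"{before}ms -> {after}ms is {ms_to_fps(before):.1f} -> {ms_to_fps(after):.1f} fps")
# 308ms -> 45ms is ~3.2 -> ~22.2 fps; 576ms -> 86ms is ~1.7 -> ~11.6 fps
```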

Fuck man, I was so stoked about these new GPUs, but it seems like they're just wasting extra silicon on a marketing stunt while nerfing the true performance gains we should be expecting with 2 years of development and a die shrink to be just barely better than Pascal.

Oh and no HDMI 2.1

WTF... so conflicted. I hope their announcement on Monday will clear things up, but it doesn't look good. Until AMD can counter with something of their own, this seems like a slap in the face for gamers. The 2080 is like 5% faster than a 1080 Ti, for about the same price. WTF is that shit...
 
 


Blue's News