Slick wrote on Feb 18, 2019, 07:24:
You think a multi-million $ AAA studio is going to cater to the 0.02% of users who have an RTX card? Or the 100+ million of them who have a PS4, Xbox, or "normal PC"?
Their focus will be getting the best image quality on the 99.8% of systems. It would be irresponsible for them to actually build a true RTX game, as they have zero hope in hell of ever recouping their investment.
Sorry guise, truth hurts.
I would have wholeheartedly agreed with that hurtful truth until about a week ago, but then I saw UbiSoft's PC sales numbers, which skyrocketed for no obvious reason (no new Anno or other PC exclusive that would explain a spike) by a whopping 58% YoY. Now I'm not so sure anymore.
This might all change again next year or in 2021 when the new toys (PS5 et al.) arrive in stores, but for now it looks like PC gaming is experiencing a resurgence, as the current gen of conslows seems to be getting a bit long in the tooth.
That's why some AAA devs/publishers will probably take the RTX plunge. Let's also not forget that nVidia has excellent developer support, by all accounts.
Implementing ray-tracing should be relatively little work for the actual devs. It is mostly done by on-site nVidia graphics engineers (in the future, in tandem with AMD engineers once AMD has hardware-RT-capable GPUs out there).
Shit, I'm still waiting for games to utilize all the great features of DX12, LOL!
Well, ray-tracing is now part of the DirectX 12 API (DXR). Any garage script kiddie can download the DX12 SDK and start gigaraying the living fuck out of their Pacman clone.
nVidia are pioneers in that they collaborated with MS to make RT a DX12 standard and were the first to release GPUs with dedicated RT hardware, but it is now up to DX12 developers to either adopt it or dump it.
What is really unfortunate is the state of the GPU market. I wonder what it would look like if there had been no crypto boom, no excess stock, and all those other problems.
Maybe then a 7nm graphics card lineup would already be imminent, with plans to move to 5nm next year.
TSMC has been trucking along with nary a bump in the road for the last couple of years.
If it weren't for economic factors, we could see much larger jumps in GPUs. But those very advanced process nodes like 7nm(+), 5nm, and beyond are expensive as fuck for the hardware makers... btw, the price hikes are not just greed by Apple or nVidia or AMD; they also have a basis in exploding manufacturing costs and R&D budgets (i.e. R&D both at the foundry level at TSMC and at the hardware makers themselves).
These factors are really unfortunate for VR and RT, since hardware is moving so slowly these days. For VR and RT to take off we really "need" a couple of years like the old days, when AMD/nVidia would release a new gen in February and the next one by November of the same year.