NVIDIA's Cloud Rendering

NVIDIA announces CloudLight, a cloud-based "system for amortizing indirect lighting in real-time rendering." They say this new framework "explores tradeoffs in different partitions of the global illumination workload between Cloud and local devices, with an eye to how available network and computational power influence design decisions and image quality." This video offers a look at what this means in case the following explanation isn't crystal clear:
We introduce CloudLight, a system for computing indirect lighting in the Cloud to support real-time rendering for interactive 3D applications on a user's local device. CloudLight maps the traditional graphics pipeline onto a distributed system. That differs from a single-machine renderer in three fundamental ways. First, the mapping introduces potential asymmetry between computational resources available at the Cloud and local device sides of the pipeline. Second, compared to a hardware memory bus, the network introduces relatively large latency and low bandwidth between certain pipeline stages. Third, for multi-user virtual environments, a Cloud solution can amortize expensive global illumination costs across users. Our new CloudLight framework explores tradeoffs in different partitions of the global illumination workload between Cloud and local devices, with an eye to how available network and computational power influence design decisions and image quality. We describe the tradeoffs and characteristics of mapping three known lighting algorithms to our system and demonstrate scaling for up to 50 simultaneous CloudLight users.
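
For the curious, here's a rough sketch of the kind of split the abstract describes: the local device keeps rendering its cheap direct lighting every frame, while indirect lighting arrives asynchronously from a server and is blended in whenever a fresh result shows up. To be clear, this is a hypothetical Python mock-up, not NVIDIA's code; the function names, the queue standing in for the network, and the lightmap-style handoff are all assumptions for illustration.

# Hypothetical sketch (not NVIDIA's code) of the cloud/local split described above.
# Assumption: the server ships indirect lighting as an occasional update, while the
# local device renders direct lighting every frame and blends in whatever indirect
# result arrived most recently, tolerating network latency.
import queue
import threading
import time

indirect_updates = queue.Queue()   # stand-in for the network link to the cloud

def compute_indirect_lighting(scene_time):
    """Stand-in for the expensive GI pass run in the cloud (path tracing, photons, ...)."""
    time.sleep(0.4)                            # pretend this is slow
    return 0.25 + 0.05 * (scene_time % 1.0)    # fake indirect term

def cloud_worker():
    """Server side: amortize GI across users and push results over the 'network'."""
    while True:
        indirect_updates.put(compute_indirect_lighting(time.time()))

def local_render_loop(frames=60):
    """Client side: cheap direct lighting at ~30 fps, stale-but-recent indirect blended in."""
    latest_indirect = 0.0
    for frame in range(frames):
        direct = 0.6                                          # fake locally rendered direct term
        try:
            latest_indirect = indirect_updates.get_nowait()   # use a new GI result if one arrived
        except queue.Empty:
            pass                                              # otherwise keep the previous one
        print(f"frame {frame}: final = {direct + latest_indirect:.3f}")
        time.sleep(1 / 30)                                    # ~33 ms frame budget

threading.Thread(target=cloud_worker, daemon=True).start()
local_render_loop()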

Re: NVIDIA's Cloud Rendering
Jul 29, 2013, 11:18

I just noticed, after watching the video with the audio on, that this isn't meant to replace the engine/GPU work of a normal PC or console game, but rather to allow realistic GI on devices with inferior or low-power GPUs running at 30fps. So 50ms of latency is probably closer to where it's at; at 30fps, 100ms of latency would be extreme.
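(For rough context: at 30fps a frame takes about 1000/30 ≈ 33ms, so 100ms of lighting latency would trail the image by roughly three frames, while 50ms is only about a frame and a half.)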

I wonder whether it wouldn't be more efficient to make a dedicated chip for this specific task, though, because I don't see GI ever being replaced by anything else. The only way to get realistic light is to trace light rays; implementations may vary, but the actual computations will never (in the near future) become obsolete... and the more of them, and the faster, the merrier.

So maybe high-end GPUs should just have a dedicated photon-tracing chip...