Ray tracing explained: The future of hyper-realistic graphics

Ivy Posts: 7,165

A really informative article on rendering and lighting. This is a must see, and it includes a video.

It goes really in depth explaining ray-traced lighting and a future of rendering that depends more on hardware than software. Maybe that's why Iray is becoming more accepted for rendering in the gaming industry?

Ray tracing explained: The future of hyper-realistic graphics

Comments

  • jag11 Posts: 885

    Thanks @Ivy, I really enjoyed watching the video. 

  • bluejaunte Posts: 1,990

    Good summary.

  • Ivy Posts: 7,165

    When I got this article in my Photoshop creative newsletter today, I kind of already knew this was where 3D rendering was heading. So IMO hardware (if you can find it) will be your biggest investment for rendering and playing games. I can see a really huge need for render farm access for Daz renders. 

  • KindredArts Posts: 1,332
    edited April 2018
    Ivy said:

    When I got this article in my Photoshop creative newsletter today, I kind of already knew this was where 3D rendering was heading. So IMO hardware (if you can find it) will be your biggest investment for rendering and playing games. I can see a really huge need for render farm access for Daz renders. 

    Computational power isn't catching up to ray tracing very quickly; it's very resource intensive. However, the new deep-learning denoisers give me much more hope. You can feed them only a minimal number of samples and eke out a nice crisp image. Lookie here.
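    A toy sketch of the sample-count problem those denoisers work around: Monte Carlo noise falls only as 1/√N, so every halving of the noise costs roughly 4x the samples. The single-pixel model below is hypothetical (just true radiance plus Gaussian noise), not Iray's actual integrator:

    ```python
    import random

    def mc_luminance(samples, true_value=0.5, noise=0.3):
        """Hypothetical stand-in for path tracing one pixel: each sample
        is the true radiance plus Gaussian noise; return the sample mean."""
        total = sum(true_value + random.gauss(0.0, noise) for _ in range(samples))
        return total / samples

    random.seed(42)
    for n in (4, 64, 1024):
        # Average absolute error over many trial renders of the pixel;
        # it shrinks roughly as 1/sqrt(n), never linearly in n.
        avg_err = sum(abs(mc_luminance(n) - 0.5) for _ in range(200)) / 200
        print(f"{n:5d} samples -> mean |error| ~ {avg_err:.4f}")
    ```

    That 1/√N wall is exactly why stopping early and denoising the remaining grain is so attractive.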

  • Ivy Posts: 7,165

    @KindredArts Encouraging. In the article they did say NVIDIA was working on having the GPU do ray-tracing computations, but like you said, it's very resource intensive.

    Did you see the shot in the video of that huge render farm when they were talking about big studios rendering ray tracing? Can you imagine the possibilities of having access to something like that without breaking the bank? :D

  • KindredArts Posts: 1,332
    Ivy said:

    @KindredArts Encouraging. In the article they did say NVIDIA was working on having the GPU do ray-tracing computations, but like you said, it's very resource intensive.

    Did you see the shot in the video of that huge render farm when they were talking about big studios rendering ray tracing? Can you imagine the possibilities of having access to something like that without breaking the bank? :D

    It would be pretty great, but I bet it runs up a pretty steep power bill. Still, with a mixture of hardware advancement, deep learning, and general real-time interest (which is super popular atm with things like Eevee and UE4), I think we'll see massive improvements over the next decade.

  • kyoto kid Posts: 41,848
    edited April 2018

    ...AI denoising is one area that Otoy is exploring for Octane 4.  It may actually make that $3,000 Titan V with the 640 Tensor cores worth the cost for design and small animation studios. All it needs to really take advantage of its power is NVLink (it still uses SLI).

  • drzap Posts: 795
    kyoto kid said:

    ...AI denoising is one area that Otoy is exploring for Octane 4.  It may actually make that $3,000 Titan V with the 640 Tensor cores worth the cost for design and small animation studios. All it needs to really take advantage of its power is NVLink (it still uses SLI).

    V-Ray GPU is already using NVLink. I imagine other renderers won't be far behind.

  • Ivy Posts: 7,165
    edited April 2018
    kyoto kid said:

    "All it needs to really take advantage of it's power is NVLink (still uses SLI)."

    Wouldn't being able to use SLI in rendering be a good thing? Then we could combine the GPUs to increase the computations, kind of like SLI combines GPU power when rendering gameplay. Though I doubt it would help all that much when trying to do ray tracing, because it takes so many calculations, especially when adding in occlusion for refined render quality. Still, it would be great to access the GPUs in SLI mode for rendering.

  • kyoto kid Posts: 41,848
    edited April 2018
    drzap said:
    kyoto kid said:

    ...AI denoising is one area that Otoy is exploring for Octane 4.  It may actually make that $3,000 Titan V with the 640 Tensor cores worth the cost for design and small animation studios. All it needs to really take advantage of its power is NVLink (it still uses SLI).

    V-Ray GPU is already using NVLink. I imagine other renderers won't be far behind.

    ...Sadly, the only GPUs that can take advantage of NVLink are the Tesla V100, Quadro GV100, and GP100 (the lowest cost of the bunch at around $7,200), all of which are pretty much out of our budgets.

  • kyoto kid Posts: 41,848
    Ivy said:
    kyoto kid said:

    "All it needs to really take advantage of it's power is NVLink (still uses SLI)."

    Wouldn't being able to use SLI in rendering be a good thing? Then we could combine the GPUs to increase the computations, kind of like SLI combines GPU power when rendering gameplay. Though I doubt it would help all that much when trying to do ray tracing, because it takes so many calculations, especially when adding in occlusion for refined render quality. Still, it would be great to access the GPUs in SLI mode for rendering.

    ...SLI offers little to no benefit for rendering. It does not allow for memory pooling (stacking) like NVLink does. The only advantage would be the additional cores, which can reduce render times and improve display response in full Iray view mode. Where SLI usually comes into play is being able to swap between multiple GPUs to enhance frame rate in games.
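    To make that pooling point concrete: without pooling, every card must hold its own full copy of the scene, so the largest scene you can render is still capped by a single card's VRAM; NVLink-style pooling lets the scene span the cards. A toy model (the VRAM numbers below are illustrative, not real card specs):

    ```python
    def max_scene_gb(vram_per_gpu, gpus, pooled):
        # Without pooling (the SLI-style situation), each GPU needs its
        # own complete copy of the scene, so the ceiling is one card.
        # With NVLink-style memory pooling, VRAM stacks across cards.
        return vram_per_gpu * gpus if pooled else vram_per_gpu

    print(max_scene_gb(11, 2, pooled=False))  # SLI-style: still 11
    print(max_scene_gb(11, 2, pooled=True))   # pooled: 22
    ```

    Extra cards without pooling still help render *speed* (more cores chewing through samples), just not the maximum scene size.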
