At what point will we see a jump in rendering?

Usernamenottaken Posts: 53
edited February 2020 in The Commons

I don't know much about Iray or how fast it renders, or whether it is efficient or slow by today's standards. Regardless, where do we stand with the future of rendering? To me, rendering a still image is totally fine. I have a 2080 Ti and it does more than enough for a render. If I want a short few-second test clip, I can usually pump out a minimal scene at 1080p, 60 FPS overnight if I keep most things to the bare minimum. Maybe 250 iterations each frame?

But what's to come? Are we stuck relying on faster GPUs, or will it come from the software end?
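The arithmetic behind that overnight budget can be sketched as follows (the 3-second clip length and the 10-hour overnight window are assumptions; the 1080p/60 FPS/250-iteration figures are from the post above):

```python
# Back-of-envelope budget for the overnight test clip described above.
# Assumed: a 3-second clip and a 10-hour overnight render window;
# the fps and per-frame iteration count come from the post.
clip_seconds = 3
fps = 60
iterations_per_frame = 250

frames = clip_seconds * fps                        # 180 frames
total_iterations = frames * iterations_per_frame   # 45,000 iterations

overnight_hours = 10
seconds_per_frame = overnight_hours * 3600 / frames

print(f"{frames} frames, {total_iterations} iterations total")
print(f"~{seconds_per_frame:.0f} s available per frame to finish overnight")
```

At 200 seconds per frame, even a minimal scene leaves little margin, which is why anything much longer than a few seconds quickly becomes impractical on a single card.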


Comments

  • fred9803 Posts: 1,565

    You've already got just about the best GPU there is for Iray rendering at the moment. NVIDIA will reportedly be releasing the 3000-series cards later this year, so there should be a performance boost with the RTX 3080 Ti over the 2080 Ti.

    Rendering animations always takes a long time... even for studios with render farms. I have no idea of anything big around the corner, software or hardware, that might dramatically reduce render times.

  • wolf359 Posts: 3,936
    Usernamenottaken said:

    I don't know much about Iray or how fast it renders, or whether it is efficient or slow by today's standards. Regardless, where do we stand with the future of rendering? To me, rendering a still image is totally fine. I have a 2080 Ti and it does more than enough for a render. If I want a short few-second test clip, I can usually pump out a minimal scene at 1080p, 60 FPS overnight if I keep most things to the bare minimum. Maybe 250 iterations each frame?

    But what's to come? Are we stuck relying on faster GPUs, or will it come from the software end?

    It will come from the software end. Unreal Engine and Blender's Eevee are examples of how better software is dealing with rendering animation and VFX... Brute-force solutions that force you to keep upgrading hardware are fine if you happen to sell GPU hardware, as NVIDIA does... I am moving my filmmaking pipeline to Blender 2.8x.
  • alexhcowley Posts: 2,403
    fred9803 said:

    You've already got just about the best GPU there is for Iray rendering at the moment. NVIDIA will reportedly be releasing the 3000-series cards later this year, so there should be a performance boost with the RTX 3080 Ti over the 2080 Ti.

    Rendering animations always takes a long time... even for studios with render farms. I have no idea of anything big around the corner, software or hardware, that might dramatically reduce render times.

    At one point, George Lucas's special effects company, Industrial Light and Magic, reputedly had more computing power than NASA!

    Cheers,

    Alex.

  • Asari Posts: 703
    edited February 2020
    I agree that it needs to come from the software end. True, every new GPU generation will be more powerful than the last, but then the requirements will rise. The Turing generation is miles ahead of the Kepler generation in terms of performance. But back then we didn't render mesh hair and cloth models with millions of vertices, or subdivision-4 figures ... nowadays we render hair and trees where each strand and each leaf is modelled, albeit instanced. And outside of DAZ, 8k textures seem widely used. Not to forget real-time rendering.

    But even for stills, the headroom is narrow. By the time the 3080 Ti rolls around, 8k displays might already be a thing. The 2080 Ti holds up well for 4k rendering, but for real-time raytracing at 4k it doesn't perform all too well. My guess is the 3080 Ti will hold up better with 4k raytracing but will not be ready for 8k raytracing. For 8k stills rendering it will probably perform about the same as the 2080 Ti does at 4k. So there goes the performance gain.

    Better hardware only means faster rendering if the requirements stay equal. But they change over time, too. Eight years ago FHD was amazing. Nowadays it's merely smartphone quality. And even that is waning ...
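    The resolution treadmill described above is just pixel arithmetic: each step up roughly quadruples the number of pixels a path tracer has to converge (a minimal sketch using the standard resolutions):

```python
# Pixel counts for common output resolutions: each step quadruples
# the work, which is roughly why per-generation GPU gains get eaten.
resolutions = {
    "FHD (1920x1080)": 1920 * 1080,
    "4K (3840x2160)": 3840 * 2160,
    "8K (7680x4320)": 7680 * 4320,
}
fhd = resolutions["FHD (1920x1080)"]
for name, pixels in resolutions.items():
    print(f"{name}: {pixels:,} pixels ({pixels / fhd:.0f}x FHD)")
```

    So a card that renders 8k stills as fast as the 2080 Ti renders 4k has, in effect, gained a factor of four and spent it immediately.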

  • The issue with GPU rendering is that everything has to be loaded onto your graphics card, so you are always going to be limited by size.

    Things like out-of-core textures can help.

    The advantage, of course, is that graphics cards are cheaper than powerful CPUs.

    But if money is no object, most professional production companies will use render farms, not racks of graphics cards.
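    A quick sketch of why VRAM fills up so fast with 4k and 8k texture sets (raw uncompressed RGBA sizes at 8 bits per channel; Iray may compress or downscale in practice, so treat these as upper bounds):

```python
# Raw (uncompressed) size of one square RGBA texture map at 8 bits
# per channel. Real renderers may compress, but this shows the scale.
def raw_texture_bytes(side_px, channels=4, bytes_per_channel=1):
    return side_px * side_px * channels * bytes_per_channel

for side in (2048, 4096, 8192):
    mib = raw_texture_bytes(side) / 2**20
    print(f"{side}x{side}: {mib:.0f} MiB")
```

    One 8k map is 256 MiB raw; a figure with a dozen such maps can swallow gigabytes before the geometry is even counted, which is where out-of-core texturing earns its keep.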

  • Software will improve but you cannot expect massive improvements from that side. If there were truly significant gains to be had from better algorithms or optimization then the big VFX houses would be paying to get that done.

    Hardware improvements are where you can expect the most speed gains. To start with, CPU compute power has grown severalfold in less than 5 years. In 2016 the absolute best CPU you could get, for this sort of workload, was a Broadwell Xeon with 22c/44t in a dual-socket system (so 44c/88t total). Now Epyc Rome can give you a 64c/128t CPU, so a dual-socket setup is 128c/256t (both CPUs run at roughly the same base clock, and the Epyc has an IPC advantage as well).

    The improvement in GPU compute over the same timespan has been equally impressive. The 980 Ti was the flagship Nvidia GPU, with roughly 2800 CUDA cores at a base clock of, IIRC, 1.1 GHz and 6 GB of VRAM. Today you can get the 2080 Ti with roughly 4300 CUDA cores, a 1.3 GHz clock and 11 GB of VRAM. The Turing cards also give you the Tensor and RT pipelines, which can both be leveraged for significant gains in rendering.

    Just look at those numbers and then consider what it would be like rendering a Genesis 2 scene today on modern HW.
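    Using just the figures quoted above, a naive cores-times-clock comparison looks like this (it ignores architectural IPC gains and the RT/Tensor pipelines entirely, so the real-world gap is larger):

```python
# Naive relative FP32 throughput: CUDA core count x base clock.
# Numbers are the approximate figures quoted in the post above.
def rel_throughput(cuda_cores, base_clock_ghz):
    return cuda_cores * base_clock_ghz

gtx_980ti = rel_throughput(2800, 1.1)
rtx_2080ti = rel_throughput(4300, 1.3)
print(f"2080 Ti vs 980 Ti: ~{rtx_2080ti / gtx_980ti:.1f}x on raw cores x clock")
```

    Even this crude ratio shows nearly double the raw throughput in one generation, before counting the VRAM growth or the dedicated ray-tracing hardware.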

  • nicstt Posts: 11,715

    You're from a gaming background?

    Just curious, as there have been continual improvements; RTX and AMD's alternative should give some decent improvements over time.

    ... And the more they do, the more folks add 'features'.

  • Robinson Posts: 751
    wolf359 said:
    I am moving my filmmaking pipeline to Blender 2.8x

    How can you use other tools with Daz content, given that we cannot export high-resolution models (with skinning and weight maps)?  I've played with Unreal Engine just to see what's up, and I just couldn't get the look I wanted with FBX exports going through Maya.  I played with various smoothing options in Maya to increase the poly count, but you lose the weight mapping.

  • wolf359 Posts: 3,936
    Robinson said:
    wolf359 said:
    I am moving my filmmaking pipeline to Blender 2.8x

    How can you use other tools with Daz content, given that we cannot export high-resolution models (with skinning and weight maps)?  I've played with Unreal Engine just to see what's up, and I just couldn't get the look I wanted with FBX exports going through Maya.  I played with various smoothing options in Maya to increase the poly count, but you lose the weight mapping.

    I am an animation filmmaker/storyteller... my sci-fi narratives DO NOT depend on replicating Iray HD still-render quality in my other external environments... I am not a still-render pinup/portrait maker. Most of my subjects are part of a larger environment, fully dressed (with my custom clothing)... and moving. This is a straight FBX export from Daz Studio... These weighted joint deformations are good enough for me... YMMV :-)
  • AI implemented in hardware. All your 2080s are going to be replaced by clockless ASICs that draw 20 watts.

  • The future is here!  REAL-TIME RAY TRACING at 30 frames per second in HD.

    RT cores on the GPU ARE the jump in rendering:



    I waited 30 years for real-time rendering with ray tracing. I started in the '90s with a RenderMan license and an IBM 286. Just the graphics card and a 19" Eizo monitor cost 16,000 USD... the whole system with the software license, 60,000 USD... One decent indoor render took about a week per image.

    In my eyes the future is already here... and it just made me invest in a new Ryzen X570 PC with a 2070 Super RTX for 1,200 USD, because I really want to know if I can do this at home now...


    And after 2 weeks of testing in Unreal Engine I can say: YES! 24 frames per second in HD can be done with my hobbyist setup!   But the time for baking large scenes and lightmaps will still make me upgrade my system to a minimum of 16 cores... or maybe a Threadripper?

    So ... the future is MORE CPU CORES AND MORE RT CORES... and people with talent can do cinematic movies at home...

     

  • AndyGrimm Posts: 910
    edited February 2020

    Double Posting - Sorry
