Meet Digital Humans Siren and Andy Serkis

outrider42 Posts: 3,679
edited March 2018 in The Commons

Siren is created in the Unreal Engine, she is being rendered in real time, and her acting can be performed in real time with the actress.

There is more where this came from. https://www.polygon.com/2018/3/21/17147502/unreal-engine-graphics-future-gdc-2018

Andy Serkis is also getting the digital treatment.

Unreal was already used to create the K-2SO droid in Rogue One. Someone asked how much post work was used. Well, if you go by this scene here, maybe not much. This video was created entirely in Unreal. Everything you see is running in real time on a DGX-1: the backgrounds, the lights, everything.

Using a game engine to render final shots is coming, and it is coming faster than I expected. I fully expect many CGI shots in future Star Wars films to use this technique more and more, and the rest of Hollywood and studios around the world to follow.

As for the rest of us, it will take a while longer. But real-time ray tracing is being introduced into gaming soon as part of DirectX 12 (the new DXR API), and both AMD and Nvidia are building support for it into their next line of GPUs.
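For anyone curious what "part of DirectX 12" means on the programming side, here is a rough sketch of how an application could ask the driver whether the GPU actually exposes the new raytracing tier. This is only illustrative and assumes a Windows 10 SDK that ships the DXR additions to d3d12.h; device creation and error handling are trimmed to the bare minimum.

    // Rough sketch: query Direct3D 12 for hardware DXR (raytracing) support.
    // Assumes a Windows 10 SDK with the DXR additions to d3d12.h.
    #include <d3d12.h>
    #include <wrl/client.h>
    #include <cstdio>
    #pragma comment(lib, "d3d12.lib")

    using Microsoft::WRL::ComPtr;

    int main() {
        // Create a device on the default adapter (error handling trimmed for brevity).
        ComPtr<ID3D12Device> device;
        if (FAILED(D3D12CreateDevice(nullptr, D3D_FEATURE_LEVEL_11_0,
                                     IID_PPV_ARGS(&device)))) {
            std::printf("No D3D12 device available.\n");
            return 1;
        }

        // Ask which raytracing tier, if any, the GPU and driver expose.
        D3D12_FEATURE_DATA_D3D12_OPTIONS5 options5 = {};
        if (SUCCEEDED(device->CheckFeatureSupport(D3D12_FEATURE_D3D12_OPTIONS5,
                                                  &options5, sizeof(options5))) &&
            options5.RaytracingTier >= D3D12_RAYTRACING_TIER_1_0) {
            std::printf("Hardware DXR supported (tier %d).\n",
                        static_cast<int>(options5.RaytracingTier));
        } else {
            std::printf("No hardware DXR on this GPU/driver.\n");
        }
        return 0;
    }

If the tier comes back as unsupported, a game would just stick with its usual rasterized effects, which is why the upcoming GPUs matter as much as the API itself.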

Post edited by outrider42 on

Comments

  • nonesuch00 Posts: 18,750

    The expressions & mannerisms are quite impressive, but you can see that a lot of detail was not rendered in order to enable real-time rendering; nevertheless, it still looks more photorealistic (at low res) than most high-res DAZ Studio renders.

  • TomDowd Posts: 200

    It is very impressive, but keep in mind that the Star Wars Phasma pieces - and probably the others - were all running on Nvidia's $60,000 DGX Station hardware (https://www.nvidia.com/en-us/data-center/dgx-station/). So, yea, Hollywood will be using it but we're a ways away from seeing that for the rest of us. :)

    (They said it was a DGX-1 during the demo, but Epic clarified that it was a DGX Station.)

    https://arstechnica.com/gaming/2018/03/star-wars-demo-shows-off-just-how-great-real-time-raytracing-can-look/

  • Gregorius Posts: 397
    edited March 2018

    This is awesome, though as TomDowd noted, it will probably be quite a while before these capabilities are affordable for the average consumer.  As for Hollywood, I can't help but think how this could be used to broaden casting options.  Any sufficiently talented actor could be mocapped onto a digital model of a different person who's unavailable or perhaps even deceased.  Does someone have the perfect look for your character but lack the talent?  No problem!  Making a sequel but can't get that one original actor to reprise his role from the prequel?  Again, no problem, and your audience won't have to suspend disbelief over even a mildly different look for the character due to a discontinuity in casting!

    Heck, with aging morphs like what Daz has already shown to be possible, even if the model were based on the actor behind it, the studio could digitally turn back the biological clock for flashback scenes instead of finding a kid who just happens to look enough like his/her adult counterpart.  This could lead to the most believable flashbacks (or flashforwards) ever!
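    To give an idea of how simple the underlying mechanism is, a morph like that basically just blends stored per-vertex offsets into the base mesh by a weight. Here is a tiny illustrative sketch; the names (Vertex, applyMorph) are made up for the example and are not Daz's or Unreal's actual API.

        // Illustrative blend-shape / morph-target mix: result = base + weight * delta.
        // Purely a sketch, not any engine's real data structures.
        #include <cstddef>
        #include <vector>

        struct Vertex { float x, y, z; };

        std::vector<Vertex> applyMorph(const std::vector<Vertex>& base,
                                       const std::vector<Vertex>& deltas, // offsets for the "aged" shape
                                       float weight)                      // 0 = unchanged, 1 = fully aged
        {
            std::vector<Vertex> out(base.size());
            for (std::size_t i = 0; i < base.size(); ++i) {
                out[i] = { base[i].x + weight * deltas[i].x,
                           base[i].y + weight * deltas[i].y,
                           base[i].z + weight * deltas[i].z };
            }
            return out;
        }

    Dial the weight per shot and you get the flashback version of the same model without touching the casting at all.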

    Post edited by Gregorius on
  • Bobvan Posts: 2,653
    edited March 2018

    Bring back Star Trek TOS..

    Post edited by Bobvan on
  • ChadCrypto Posts: 596

    They are getting so close. Last year it was the Hellblade game using the same tech; this year, this. I think they have crossed 90 percent of the uncanny valley to almost real. They still have to solve the human eyes, which is the biggest hurdle.

  • kyoto kid Posts: 41,882
    Bobvan said:

    Bring back Star Trek TOS..

    ...yes

  • outrider42 Posts: 3,679
    TomDowd said:

    It is very impressive, but keep in mind that the Star Wars Phasma pieces - and probably the others - were all running on Nvidia's $60,000 DGX Station hardware (https://www.nvidia.com/en-us/data-center/dgx-station/). So, yea, Hollywood will be using it but we're a ways away from seeing that for the rest of us. :)

    (They said it was a DGX-1 during the demo, but Epic clarified that it was a DGX Station.)

    https://arstechnica.com/gaming/2018/03/star-wars-demo-shows-off-just-how-great-real-time-raytracing-can-look/

    That's even better, then. The DGX-1 is a $150,000 system, so being able to run this movie-quality scene on a $60,000 system is even more astounding.

    Consider the original Titan from 2013. That card was a beast, but only a few years later it has been completely surpassed by regular consumer-level cards. The DGX Station has 4 big Teslas in it. I believe it will take less than 10 years for consumer cards to top them, and you will be able to build a system that can do this for much less than $10,000 in a few years. Progress is happening fast.