Meet Digital Humans Siren and Andy Serkis
Siren was created in Unreal Engine, she is rendered in real time, and her acting can be driven in real time by the actress.
There is more where this came from. https://www.polygon.com/2018/3/21/17147502/unreal-engine-graphics-future-gdc-2018
Andy Serkis is also getting the digital treatment.
Unreal was already used to create the K-2SO droid in Rogue One. Someone asked how much post work was used. Well, if you go by this scene here, maybe not much. This video was created entirely with Unreal. Everything you see is running in real time on a DGX-1: the backgrounds, the lights, everything.
Using a game engine to render is coming, and it is coming faster than I expected. I fully expect many CGI shots in future Star Wars films to rely on this technique, along with the rest of Hollywood and studios around the world.
As for the rest of us, it will take a while longer. But real-time ray tracing is already being introduced into gaming as part of DirectX 12, and both AMD and Nvidia are building support into their next line of GPUs.
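For a sense of what ray tracing actually computes, here is a minimal sketch (not Unreal's pipeline or the DirectX 12 API, just an illustration) of the core operation the new GPUs accelerate: testing a ray against scene geometry, in this case a single sphere:

```python
import math

def ray_sphere_hit(origin, direction, center, radius):
    """Return the distance t along the ray to the nearest sphere hit, or None.

    origin, direction, center are 3-tuples. This solves the quadratic
    |origin + t*direction - center|^2 = radius^2 for the smallest positive t.
    """
    # Vector from the sphere center to the ray origin
    oc = tuple(o - c for o, c in zip(origin, center))
    a = sum(d * d for d in direction)
    b = 2.0 * sum(o * d for o, d in zip(oc, direction))
    c = sum(o * o for o in oc) - radius * radius
    disc = b * b - 4 * a * c
    if disc < 0:
        return None  # ray misses the sphere entirely
    t = (-b - math.sqrt(disc)) / (2 * a)  # nearer of the two intersections
    return t if t > 0 else None

# A ray from the origin along +z hits a unit sphere centered at (0, 0, 5)
# at its near surface, z = 4.
print(ray_sphere_hit((0, 0, 0), (0, 0, 1), (0, 0, 5), 1.0))  # 4.0
```

A renderer fires millions of these tests per frame (rays per pixel, plus bounces for reflections and shadows), which is why dedicated hardware support matters so much for doing it in real time.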




Comments
The expressions and mannerisms are quite impressive, but one could see that much detail was not rendered in order to enable real-time rendering; nevertheless, it still looks more photorealistic (at low res) than most high-res DAZ Studio renders.
It is very impressive, but keep in mind that the Star Wars Phasma pieces - and probably the others - were all running on Nvidia's $60,000 DGX Station hardware (https://www.nvidia.com/en-us/data-center/dgx-station/). So, yea, Hollywood will be using it, but we're a ways away from seeing that for the rest of us. :)
(They said it was a DGX-1 during the demo, but Epic clarified that it was a DGX Station.)
https://arstechnica.com/gaming/2018/03/star-wars-demo-shows-off-just-how-great-real-time-raytracing-can-look/
This is awesome, though as TomDowd noted, it will probably be quite a while before these capabilities are affordable for the average consumer. As for Hollywood, I can't help but think how this could be used to broaden casting options. Any sufficiently talented actor could be mocapped onto a digital model of a different person who's unavailable or perhaps even deceased. Does someone have the perfect look for your character but lack the talent? No problem! Making a sequel but can't get that one original actor to reprise his role from the prequel? Again, no problem, and your audience need not suspend disbelief over even a mildly different look for the character due to a discontinuity in casting!
Heck, with aging morphs like what Daz has already shown to be possible, even if the model were based on the actor behind it, the studio could digitally turn back the biological clock for flashback scenes instead of finding a kid who just happens to look enough like his/her adult counterpart. This could lead to the most believable flashbacks (or flashforwards) ever!
Bring back Star Trek TOS..
They are getting so close. Last year it was the game Hellblade using the same tech; this year, this. I think they have crossed 90 percent of the distance from uncanny to almost real. They still have to solve the human eyes, which is the biggest hurdle.
...