Virtual Production & Daz
This is an older video, but I watched it today and wondered if there could be any application with Daz. After all, we have actors and background and we can create "virtual" LED screens.
Would it be perhaps possible to model a "stage" to mimic their virtual production? They even get lighting from a virtual sky.
The only trick, I think, would be to match the perspectives and parallax a camera would record in real life. They have special tech for this -- I wonder if a talented person could rig up something similar?
Just spitballing. 



Comments
...so, how long until the first holodeck?
What... why would we need to put Daz characters in front of a screen that represents an environment when we can put them in an actual environment?
Environments are great, but they can cost a lot of vram, especially if you have multiple characters.
Also, sometimes Daz doesn't have the environment you need, whereas there are millions upon millions of high-resolution photographs.
Aren't you just describing an HDRI then? (And you couldn't just slap a photo in for it to work the way the screen works; you would need light information that a normal photo doesn't provide.)
HDRIs are also great, but they're...
a) Difficult to work with because most aren't really designed to serve as environments.
b) Limited in their use because of scaling issues, orientation, etc.
Hm. Why wouldn't a virtual "LED screen" work the same way as they are in the above example? They're lighting the scene with LED screens, not with an HDRI. No doubt it's not entirely accurate, but since Mandalorian looks pretty amazing, it seems to be "good enough."
The trick (maybe) is you need those screens in 360 degrees and above and below.
...in spite of the render load I still prefer building settings with geometry over HDRIs as the setting can be tailor made for the scene. I've also had issues with some HDRs where the "sun" had a lower luminosity value than the Iray sun-sky and shadows had too hard an edge. I've only sparingly used photo backdrops and that was when I actually needed a specific RL location. Even then it was just a backdrop on a plane primitive while pretty much all of the foreground was built with geometry. In some cases I even would mat out the sky in the photo to use a skydome which gives more depth.
The one shortcoming with Iray is you can't turn shadows off so anything that can cast a shadow on the photo background is problematic.
The Mandalorian setup involves realtime rendering. As the director moves the camera, the projected background moves to account for things like parallax. Its main advantages over something like a greenscreen are that a) the director can see what the actual finished shot looks like, and b) the real-world actors are actually lit, so they blend better with the background.
Since we're working with 3D characters, b is irrelevant.
In addition, using a large subdivided mesh with a large detailed image as emission is far more computationally expensive for something that would function identically to an HDRI.
The Mandalorian setup is basically a real-world HDRI; its added advantages are purely the result of it being a realtime render rather than a still image. But it's pretty explicitly not a way around rendering your background, as it is, you know, a rendered background.
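For what it's worth, the parallax half of this is simple geometry. Here's a minimal sketch (my own assumptions: a flat wall at z = 0, a tracked camera in front of it, and a made-up helper name):

```python
def project_to_wall(cam, point, wall_z=0.0):
    """Intersect the ray from the camera through a virtual scene point
    with the wall plane z = wall_z, returning the (x, y) spot on the
    wall where that point should be drawn for this camera position."""
    cx, cy, cz = cam
    px, py, pz = point
    t = (wall_z - cz) / (pz - cz)  # how far along the ray the wall sits
    return (cx + t * (px - cx), cy + t * (py - cy))

# A virtual rock 10 m behind the wall, seen from two camera positions:
# moving the camera sideways shifts where the rock lands on the wall,
# which is exactly the parallax the realtime system reproduces.
print(project_to_wall((0.0, 1.7, 5.0), (0.0, 2.0, -10.0)))
print(project_to_wall((2.0, 1.7, 5.0), (0.0, 2.0, -10.0)))
```

A static image baked onto a plane can't do this re-projection, which is the point being made above.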
Yes, I know. I read the article and mentioned this in my initial post.
Just as they could light the actors (if necessary, I actually don't see this in any of the articles), we could do the same.
This would be potentially useful in large scenes, not small rooms.
To an HDRI? Possibly. To a whole environment? Unlikely.
Despite all these objections, it looks like Dreamlight already does something like this...
https://www.daz3d.com/dead-tree-desert
https://www.daz3d.com/3d-photo-scenes--weird-places
...and it looks pretty great!
Right, that's an HDRI. You are describing an HDRI.
I meant it would be more computationally complex than an HDRI and function identically to an HDRI.
I mean, that's not remotely the same thing at all. The background isn't emitting light, which is at least half the purpose of the Mandalorian setup.
That's an image-plane backdrop, which has been around in DS since the V3 era. (TBF, those are quite a good-looking version thereof.)
the blurb even mentions
I'm just confused: what feature, specifically, are you looking for that goes beyond an HDRI + a ground object of some sort?
Why are you being so hostile?
HDRIs are only useful if the camera is placed "just so," which is difficult to achieve unless they've been specially made for Daz Studio.
I'm asking if it is possible to replicate something like the virtual production setup the Mandalorian uses in order to be able to use simple JPEGs + fake "LED screens" to create plausible (not necessarily physically accurate) lighting, just like they do on the set of the Mandalorian. It could conceivably grant you the same advantages as it does with the television production: Fewer resources necessary to produce an expansive scene.
I suppose I'll give it a shot myself and see what happens.
She's not being hostile. It's just that your posts suggest that, despite your protestations to the contrary, you don't actually understand how the Mandalorian set works, or even why it exists. The backdrop and camera are effectively generating an HDRI in real time, using a computer running Unreal Engine with a full 3D environment. The whole purpose of the set-up is to make it easier to integrate physical actors and props into a 3D environment, which, as Jack has pointed out repeatedly, is not an issue for us who work entirely in 3D.
Agreed.
Because the cost of shooting on location is stupid expensive, unless you're in small-town America, and you don't have control over so many factors. I can't tell you how much time I've wasted waiting for a plane to go by for sound, or how many times I've had to bribe folks who started mowing the lawn to stop for an hour or so. Or daylight: once the sun moves a certain way, you can't match anything. But doing it in a virtual realm, you have full control over it.
I used Daz characters in my last feature film because we didn't have the money to hire background actors. You can't tell the difference if they're in the background.
What? The backdrop and camera are definitely not generating an HDRI in real time. They're literally just big emissives with pictures, which we can do with simple planes. That isn't an HDRI or even close.
And as I have pointed out repeatedly, sometimes that isn't feasible with limited GPU memory.
What I'm wondering is if we can do the opposite: More easily integrate 3D characters into a real environment. It would be something like IBL. I don't know if it will work. It was an idea.
See, now I'm not convinced you even understand what an HDRI is. The Mandalorian setup is not literally an HDRI, but it is the functional real-world equivalent. The image shown on the backdrop is always relative to the virtual camera, like an HDRI, which you can't do with a static image on a simple plane. Also, a plane could be set to emit light, but it wouldn't provide environmental lighting the way the backdrop does, since it wraps around the physical set.
HDRIs ARE IBL. The part that you still don't seem to understand is that the Mandalorian setup doesn't replace the need to render a 3D environment, because rendering a 3D environment is exactly how it works. If you're worried about limited GPU memory, you can try rendering your own HDRIs in DS, or you can render with Unreal Engine, as the Mandalorian set does, but trying to replace that with an image on a plane can't achieve the same results.
The images that appear on the StageCraft screens aren't static - they are real-time rendered in Unreal Engine, so there's no escaping the overhead of geometry, textures, and so on... The burden just isn't in the recording camera; there are crazy powerful systems driving those screens. The camera itself is tracked in real time, and the images rendered to the LED panels are adjusted to match its position, so the scenery elements have the correct parallax and so on. Only the part of the image that is in the field of view of the camera is rendered at full resolution. Everything outside of the camera's field of view renders at a lesser resolution to maintain the proper lighting from the LED panels (and reflections).
You could get the same effect by somehow linking the camera in DAZ to the positioning of an external camera and then real-time compositing the two views. There's no alleviating the burden of rendering real-time if you need accurate parallax and reflections, however. If you don't, then a proper 360 image of a location could be mapped to a sphere in DAZ and rendered... but you might as well do that with an HDRI to get proper lighting. Without real-time rendering, all you end up with is the equivalent of stationary 360-degree VR.
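For the "map a 360 image to a sphere" option mentioned above, the underlying math is just the equirectangular mapping. A minimal sketch (assuming a standard equirectangular panorama with +Z forward and +Y up; `equirect_uv` is a made-up name, not a DS or Iray function):

```python
import math

def equirect_uv(direction):
    """Convert a view direction from the sphere's center into (u, v)
    texture coordinates on an equirectangular panorama."""
    x, y, z = direction
    r = math.sqrt(x * x + y * y + z * z)
    u = 0.5 + math.atan2(x, z) / (2.0 * math.pi)  # longitude -> horizontal
    v = 0.5 - math.asin(y / r) / math.pi          # latitude  -> vertical
    return u, v

print(equirect_uv((0.0, 0.0, 1.0)))  # straight ahead -> image center
print(equirect_uv((0.0, 1.0, 0.0)))  # straight up -> top edge of the image
```

This is the same mapping an HDRI dome applies, which is why a plain panorama on a sphere behaves like an HDRI minus the high-dynamic-range lighting information.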
TD
I've worked with Daz for many years and make a good secondary income from my work. I know what an HDRI is, my friend. There is no need to be condescending and rude. If you're not interested in engaging in the discussion, then don't.
Did you even read my first post? This is what I said:
So, obviously, I already know that is the difficulty. Not that it's impossible to overcome. But I don't use Daz for animations, so it's not a big deal to me. I'm more interested in the possibility of faking an HDRI-like effect.
You could easily create a fake "LED screen" that wraps around a virtual set and also emits light. You could also have a fake LED ceiling doing the same. Like in the Mandalorian and other virtual productions, the floor should be "real."
Again, I don't know if this would work. That is why I started the thread.
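As a rough geometric sketch of what "wrapping" a fake LED screen around a set could look like (everything here is an assumption: the panel count, radius, arc, and the helper name `led_wall_panels`):

```python
import math

def led_wall_panels(radius=5.0, arc_degrees=270.0, count=12):
    """Centers and facing angles for flat emissive panels laid out
    along a cylindrical arc around the set, open side toward camera."""
    panels = []
    for i in range(count):
        # spread panels evenly from -arc/2 to +arc/2 around the set center
        angle = math.radians(-arc_degrees / 2 + arc_degrees * i / (count - 1))
        x, z = radius * math.sin(angle), radius * math.cos(angle)
        panels.append({"pos": (x, z), "yaw_deg": math.degrees(angle) + 180.0})
    return panels

# each panel sits on the arc and is rotated to face the set's center
for panel in led_wall_panels(count=5):
    print(panel)
```

Each panel would then get a slice of the photo as an emissive texture; whether the resulting light is convincing would have to be tested, as said above.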
Of course I understand this. I am, in a way, suggesting the opposite of the Mandalorian: real-life photos (like a backdrop) and a virtual stage which allows them to emit enough light to act as environment lighting to adequately light the scene. HDRIs are great, but notoriously difficult to use. There are very limited angles which work when placing your characters, especially if you want to see the floor.
But if you could use the fake LED system to emit light, and somehow automagically get the perspective, horizon line, etc. of the photos to match the camera, that might be interesting for those of us working with expansive scenes.
Like I said, I'll try it out and see.
I'm not suggesting using rendered scenery, but a real-world panoramic photo (something like this) mapped onto a virtual "LED screen" within Daz which emits light, the same way the LED screens do on the virtual stage in real life.
Essentially, I'm suggesting building the virtual stage within Daz, but using real-world photos instead of rendered environments.
This may not work at all. And even if it does, the lighting won't be anywhere near as accurate as an HDRI. But maybe it will be good enough to "fake it" for scenes where an HDRI is more difficult to work with because of their limitations.
Most combinations of stuff can be made to work in the right situation. I have the feeling that most people are evaluating this idea as a general-purpose re-inventing-the-wheel type substitute for HDRI, though, and it sure seems like more trouble than it's worth. Even compositing would probably make more sense, since we're just talking about still images.