Rendering 360-degree Cubemaps in Daz Studio

Ivo Shandor Posts: 74
edited June 2016 in The Commons

I am wondering if anyone has already created a camera rig for rendering cubemaps inside of Daz Studio. I have searched for one but haven't come up with anything so far. I recently discovered that it can be done using just Daz Studio, and that it lets you create fully immersive 360-degree panoramic photosphere textures, as stereoscopic 3D images, for use in the Oculus Rift. Here's the story of how I discovered that and why I think it would be worth the time to create a rig to automate the process.

I recently received my Oculus Rift after having purchased it many months ago. The Oculus store doesn't have very many apps right now, but one of the free apps, created by Oculus itself, is called "360 Photos." Within that app there is a category for 3D CGI-rendered stereoscopic and mono photos. I noticed that this entire category was sponsored by Otoy.com, and many of its images were created by them, rendered in either OctaneVR or an Octane plug-in. There is an Octane plug-in for Daz Studio, but through research on their forums and by using the demo version of the plug-in, I found that the Daz Studio plug-in doesn't support the camera that is used to create the 3D panoramic renders.

I went into their gallery section and downloaded a few renders. I noticed that all of them were plain cubemaps, not distorted photosphere renders. So I thought: why couldn't Daz Studio render those straight out of either Iray or 3Delight? I have attached one of their gallery images. I wanted to test a theory: if I positioned a camera in a scene and set up the angle of the lens correctly, I could make renders from that camera looking left, right, front, back, down, and up. As a quick guess I set the camera to a frame width of 180mm and a focal length of 90mm, since the resulting renders would need to cover 90 degrees from each view.

To my surprise, it worked on the first try. I rendered Urban Sprawl 3 very quickly by manually turning the camera in 90-degree increments and rendering for about 5 minutes per view. When I composed the photos in the correct sequence in a photo editor, I could view the entire scene in the Oculus Rift without any seams on the cubemap. I only did a mono image to start with; to do stereoscopic, I just have to move the camera an eye width apart and render again in each direction. I have attached that test render as well.

So, to sum up, here are the details of what I am going to try to automate by creating a full camera rig with a system of 12 cameras and a batch render script: http://www.daz3d.com/batch-render-for-daz-studio-4-and-rib

Set up the cameras:

  • Focal Length: anything you want (I set it to 45mm)
  • Frame Width: double (multiply by 2) whatever you set for Focal Length (I set this to 90mm); this gives each camera exactly a 90-degree field of view, as shown in the sketch below
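
To see why the 2:1 ratio works: the horizontal field of view of an ideal pinhole-style camera is 2·atan(frameWidth / (2 · focalLength)), and when the frame width is exactly twice the focal length that works out to exactly 90 degrees, so six views tile the cube with no gaps or overlap. Here is a minimal sketch of the arithmetic (plain Python; no Daz-specific API is assumed):

```python
import math

def horizontal_fov_degrees(frame_width_mm, focal_length_mm):
    """Horizontal field of view of an ideal pinhole camera."""
    return math.degrees(2 * math.atan(frame_width_mm / (2 * focal_length_mm)))

# Frame Width = 2 x Focal Length always yields a 90-degree view,
# regardless of which focal length you pick:
print(horizontal_fov_degrees(90, 45))    # 90.0
print(horizontal_fov_degrees(180, 90))   # 90.0
```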

Render at 90-degree angles for 6 renders:

  • Front
  • Back
  • Left
  • Right
  • Up
  • Down

Set up 6 more camera angles an eye width apart from the first 6 cameras for stereoscopic renders.

Stitch together in an image editor (Photoshop, etc.). The sequence, from left to right, is:

  • Left Render - flipped horizontally
  • Right Render - flipped horizontally
  • Up Render - flipped vertically
  • Down Render - flipped vertically
  • Back Render - flipped horizontally
  • Front Render - flipped horizontally
  • Repeat for the second view for stereoscopic images

I don't know why you have to flip the images, but you do; otherwise you will be viewing all of the text in your textures backwards inside the Oculus Rift. My guess is that the images are rear-projected onto the 3D cube that you are viewing inside the viewer app.
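
For anyone who would rather script the stitching than do it by hand, here is a minimal sketch using Python with the Pillow library. It assumes the six renders are saved as square, equal-sized files named left.png, right.png, up.png, down.png, back.png, and front.png (hypothetical names); it applies the flips listed above and pastes the faces into one horizontal strip:

```python
from PIL import Image  # pip install Pillow

# (filename, flip) pairs in the left-to-right strip order given above.
FACES = [
    ("left.png",  Image.FLIP_LEFT_RIGHT),   # flipped horizontally
    ("right.png", Image.FLIP_LEFT_RIGHT),   # flipped horizontally
    ("up.png",    Image.FLIP_TOP_BOTTOM),   # flipped vertically
    ("down.png",  Image.FLIP_TOP_BOTTOM),   # flipped vertically
    ("back.png",  Image.FLIP_LEFT_RIGHT),   # flipped horizontally
    ("front.png", Image.FLIP_LEFT_RIGHT),   # flipped horizontally
]

def stitch_cubemap(faces, out_path="cubemap.png"):
    images = [Image.open(name).transpose(flip) for name, flip in faces]
    size = images[0].width  # assumes square, equal-sized renders
    strip = Image.new("RGB", (size * 6, size))
    for i, img in enumerate(images):
        strip.paste(img, (i * size, 0))
    strip.save(out_path)

stitch_cubemap(FACES)
```

For a stereoscopic image, run it again on the second set of renders and append that strip after the first.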


[Attachments: pano_small.png (2400 x 200), test_small.png (1200 x 200)]

Comments

  • DigiDotz Posts: 508
    edited June 2016

    Interesting, Ivo. I won't have an Oculus for a year or two, but I would be interested in seeing a YouTube video of the results.

  • FirePro9 Posts: 455

    I was recently looking at PLAYCANVAS (a cross-platform game engine) and noticed that they use cubemaps in their skyboxes. I immediately started wondering if I could use DAZ to create my own cubemaps. Thank you very much, Ivo, for sharing this info.

    By the way, there is no need for an Oculus Rift to view these images. Here is a link to an article about viewing cubemaps in a browser window (it works pretty well in my Chrome browser), and when I opened the same images on my smartphone and attached it to my Google Cardboard device (Viewmaster VR), the stereo VR worked very well too. Very cool for being so relatively simple.

    https://labs.chaosgroup.com/index.php/digital-design-league/viewing-vr-stereo-cubemaps-in-a-browser-and-google-cardboard/

    Here are direct links to a couple of the images in the above article:

    http://wip.sbrusse.com/BB_CubeMap/

    http://wip.sbrusse.com/KM_CubeMap/

    Maybe a PA will put together a 12-camera rig for making stereo cubemaps, though as Ivo has shown, this does not look too hard to set up yourself.


  • DigiDotz Posts: 508

    Well, once the cameras are set up, you could use the CamSeq script from mcasual to sequence the different render/camera views and then "render to image series".


  • Ivo Shandor Posts: 74
    edited June 2016

    Thanks, DigiDotz, for the link to the CamSeq script from mcasual. I tried it, and maybe I am not using it right, but it seemed to just sit there after I set it up. Maybe it doesn't work with the latest Daz Studio or Iray. I will keep working on it.

    I updated my post above because I realized I had listed the "Frame Width" I actually used incorrectly. I have also been experimenting, and it looks like you can use any "Focal Length" you want so long as your "Frame Width" is set to twice that value. That keeps the renders seamless. I am assuming that changing the focal length will just change how large objects seem when you are viewing the final cubemap. I have seen a couple of examples in the Oculus app where the focal length might be too high, giving the appearance of seeing from an ant's perspective.

  • FirePro9 Posts: 455
    edited June 2016

    I completed my first cubemap rendered in DAZ and deployed it on PLAYCANVAS using their VR Starter Kit.

    If you use a smartphone, you can use your gyro to look around, or launch Google VR/Cardboard.

  • Ivo Shandor Posts: 74
    edited June 2016

    I don't mean to keep resurrecting this thread. It is primarily meant for people searching for a way to render from Daz Studio to some of the new VR options out there. I found no threads dedicated to this, and those that touched on rendered 360 photosphere images mostly referred to outside software. That being said, I was able to make a mono (non-3D) full cubemap rather quickly with the settings I have shown in this thread.

    The thing that has eluded me is the stereoscopic 3D vision that the Oculus Rift software allows. It accepts 2 separate cubemaps, and I first thought that I could just render 2 of these cubemaps roughly an eye width apart, which appears to be about 0.8 in DAZ scale. So I set the left eye at 0.4 and the right eye at -0.4, but that yields mixed results. To keep the view from each eye seamless, all six cameras for an eye must share the same origin point in space. But as you turn from looking left or right to looking behind you, the baseline between your eyes rotates with your head, and a fixed camera pair can't reproduce that.

    So you maintain accurate 3D stereoscopic separation only in the front, top, and bottom views. As you look left and right, the stereoscopic effect lessens, until you are ultimately seeing cross-eyed when you look backwards.
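
    To make the geometry concrete: keeping the stereo baseline correct in every horizontal direction means the left/right eye offset has to rotate together with the view direction, which is why a 12-camera rig really needs a different eye position per face rather than one fixed X offset shared by all six views. A minimal sketch of that math (plain Python, purely illustrative; the 6.5 cm interpupillary distance and the axis sign conventions are assumptions, not Daz settings):

    ```python
    import math

    IPD = 6.5  # assumed interpupillary distance in scene units (e.g. cm)

    def eye_positions(view_yaw_degrees):
        """Left/right eye offsets for a camera looking along the given yaw.

        The baseline stays perpendicular to the view direction, so it is
        rotated with the camera instead of being fixed to the X axis.
        Returns ((lx, lz), (rx, rz)); flip signs if your app's axis
        handedness differs.
        """
        yaw = math.radians(view_yaw_degrees)
        right_x, right_z = math.cos(yaw), -math.sin(yaw)  # camera's right vector
        half = IPD / 2.0
        return ((-half * right_x, -half * right_z),
                (half * right_x, half * right_z))

    for name, yaw in [("front", 0), ("left", 90), ("back", 180), ("right", 270)]:
        print(name, eye_positions(yaw))
    ```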

    I tried several camera rigs, but ultimately I came to the conclusion that the DAZ team will need to create a new type of camera for these renders: either one that uses NVIDIA Ansel technology to render 360-degree screenshots, or a camera set up for 360-degree off-axis renders. The best solution looks like a left and right camera pair that face straight ahead but render off-axis so as to converge on a single point in space, the way our eyes do.

    For now, I am going with the technique of rendering 2 seamless cubemaps that you can spin around in. It seems more important to have seamless cubemaps. You can't look backwards, but at least the front, up, and down views look correct and seem three-dimensional.

    Reference articles:

    http://elevr.com/cg-vr-1/

    http://paulbourke.net/stereographics/stereopanoramic/

    http://www.tokeru.com/cgwiki/?title=HoudiniOculusWip

    http://blog.dsky.co/capturing-virtual-worlds-a-method-for-taking-360-virtual-photos-and-videos/


    [Attachment: Figure-3.00011.gif (512 x 256)]
  • FirePro9 Posts: 455

    It appears that stereo VR is quite a bit trickier than I would have guessed. Certainly VR will become more common, and hopefully some clever person will provide us DAZ users with a nice solution, maybe a plugin similar to Domemaster3D.

    http://www.andrewhazelden.com/blog/2014/10/render-spherical-stereo-content-with-the-domemaster3d-v1-6-alpha/


  • FirePro9 Posts: 455
    DigiDotz said:

    Well, once the cameras are set up, you could use the CamSeq script from mcasual to sequence the different render/camera views and then "render to image series".


    You can do a 6-frame animation where the camera is rotated for each frame. I tried rotating the camera directly, but DAZ does not seem to like that. Instead I created a primitive, rotated it in each of the 6 frames, and saved that as an aniblock.

    Parent your camera to the primitive and render the animation to an image sequence. It works well for making cubemaps; one possible keyframe assignment is sketched below.
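
    If it helps anyone reproduce this, the six keyframes only need two rotation channels on the parent primitive. Here is one possible frame-to-rotation assignment (plain Python data, purely illustrative; the axis signs depend on your scene and may need flipping):

    ```python
    # Frame number -> rotation of the parent primitive, in degrees.
    CUBE_FACE_ROTATIONS = {
        0: {"x_rot": 0,   "y_rot": 0},    # front
        1: {"x_rot": 0,   "y_rot": 90},   # left
        2: {"x_rot": 0,   "y_rot": 180},  # back
        3: {"x_rot": 0,   "y_rot": 270},  # right
        4: {"x_rot": 90,  "y_rot": 0},    # up
        5: {"x_rot": -90, "y_rot": 0},    # down
    }
    ```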

  • mindsong Posts: 1,693

    Given the 'very-alive' nature of this technology, I don't think necro-ing this thread is unwarranted.

    For reasons very well articulated above (@Ivo Shandor), two static point source spherical images cannot be truly stereoscopic in all directions (at least not without intentional distortion/postwork).

    That said, I believe one path to viable stereoscopic VR360 imagery is through the use of depth maps (6-frame cube images can work this way too).
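
    To sketch what the depth-map idea could look like in practice: given one rendered view plus its depth pass, a second eye can be approximated by shifting each pixel horizontally by a disparity proportional to baseline × focal / depth. This is a deliberately crude reprojection that leaves holes where the new eye sees around foreground objects (real implementations inpaint those). A minimal sketch (plain Python with NumPy; the baseline and focal values are illustrative assumptions, not Daz settings):

    ```python
    import numpy as np

    def synthesize_right_eye(image, depth, baseline=6.5, focal_px=1000.0):
        """Crude depth-image-based rendering via horizontal pixel shifts.

        image: (H, W, 3) uint8 array; depth: (H, W) array in scene units.
        Near pixels shift more; occluded gaps are left black.
        """
        h, w = depth.shape
        out = np.zeros_like(image)
        disparity = (baseline * focal_px / np.maximum(depth, 1e-6)).astype(int)
        cols = np.arange(w)
        for y in range(h):
            new_x = cols - disparity[y]            # shift toward the left
            valid = (new_x >= 0) & (new_x < w)
            out[y, new_x[valid]] = image[y, cols[valid]]
        return out
    ```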

    (very cool stuff going on here!)

    --ms

  • Leonides02 Posts: 1,379

    Hmmmm... Has anybody gotten something like this to work? I have Octane, and rendering a stereoscopic spherical image there is a snap: there's an option I click in the spherical camera and, bam, it's pretty much VR-ready.

    Iray has a spherical camera now, but it doesn't seem to have the same functionality. Does anybody know why, or how to rig one up?

    I'd use my Octane, but I hate what it does to Iray materials... 

    Thanks.


  • Flum Posts: 0

    @Leonides02

    A little late to the party I am, but I found this thread looking for something similar, so...


    To view your DS work in VR/3D:

    - Create a camera. Set its transforms to 0, except for Y-translate: this should be your eye height, like 170 (cm). Important! The camera has to face horizontally, parallel to the z-axis (looking straight into the scene).

    - Set Camera/Focal Length to 50 mm (in the Parameters tab), and Lens/Lens Distortion Type to "Spherical".

    - Group your scene under a null and move it in front of the camera. Don't move the camera!

    - Render at a 2:1 aspect ratio in high res! (I render 8000x4000.)

    - Duplicate the cam, move it 6.5 cm (or the distance between your eyes) to the right along the x-axis, and render.

    - Save both images.

    - In your favorite pixel editor (like Photoshop), make a new square image with the x-axis dimension of your render (e.g. 8000x8000) and place your 1st (left-eye) render in the top half and your 2nd (right-eye) render in the bottom half.

    - Save the combined image, and watch it in a VR image viewer (like Virtual Desktop). (A scripted version of this compose step is sketched below.)
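
    If you would rather script the compose step, here is a minimal Pillow sketch of it (Python), assuming the two renders are saved as left_eye.png and right_eye.png (hypothetical filenames) at the same 2:1 resolution, e.g. 8000x4000:

    ```python
    from PIL import Image  # pip install Pillow

    left = Image.open("left_eye.png")    # 1st render: top half (left eye)
    right = Image.open("right_eye.png")  # 2nd render: bottom half (right eye)

    # Square canvas: width unchanged, height doubled (8000x4000 -> 8000x8000).
    combined = Image.new("RGB", (left.width, left.height * 2))
    combined.paste(left, (0, 0))
    combined.paste(right, (0, left.height))
    combined.save("stereo_topbottom.png")
    ```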

    I've attached a low res example of what the final composition should look like.

    This method has the limitations of spherical mapping, but it can create some pretty awesome results.

    Happy rendering!

    [Attachment: moto.jpg (1000 x 1000)]
  • Leonides02 Posts: 1,379

    @Flum I literally just got back to testing this now, more than a year later. 

    Works like a charm. Thank you!

    So if anyone wants to render their images in VR, this is how you do it!

  • Flum said:

    (Flum's step-by-step instructions, quoted in full above.)

    Thanks for the tip; a short test already looks very good!
