VR Goggles?

marble Posts: 7,449
edited July 2019 in Daz Studio Discussion

Just looking for info, really, as this is a new concept for me but I've seen them advertised a lot recently. How do these VR goggles work and would they be any use for viewing Iray renders? Or is there a special render format? Are they 3D (as in stereoscopic rather than a 3D mesh rendered in Iray)?

I notice that there is a huge price range from a few dollars to hundreds - what would be OK for viewing DAZ renders (if it is possible)?

[EDIT] I did a search and found that the subject has come up before but the people discussing it seem to know the technology already. Here's an example:

https://www.daz3d.com/forums/discussion/300711/vr-support-in-daz-studio-a-long-awaited-thing

Post edited by marble on

Comments

  • elbiggus Posts: 23
    edited July 2019

    When it comes to viewing DAZ renders in VR, it depends what you want. If you're just after a 3D scene that you can look around from a fixed spot then it's pretty simple: create two cameras placed about 6cm apart, give each a spherical lens distortion, render them out, and then just use any VR image viewing software; you can even use one of those $10 things you put a smartphone into and it'll work fine. If you want "proper" VR, though, allowing you to move around and view the scene from any angle or position, you're out of luck -- as a general baseline you'd probably want to aim for two 2000x2000 renders 90 times a second, so you need software that's built for real-time rendering.

    Modelling in VR is kind of neat, but it's a very different sort of thing (see here, here, and here for three different approaches to it) and highly unlikely to be coming to DAZ any time soon!

    Post edited by elbiggus on
  • marble Posts: 7,449
    elbiggus said:

    When it comes to viewing DAZ renders in VR, it depends what you want. If you're just after a 3D scene that you can look around from a fixed spot then it's pretty simple: create two cameras placed about 6cm apart, give each a spherical lens distortion, render them out, and then just use any VR image viewing software; you can even use one of those $10 things you put a smartphone into and it'll work fine. If you want "proper" VR, though, allowing you to move around and view the scene from any angle or position, you're out of luck -- as a general baseline you'd probably want to aim for two 2000x2000 renders 90 times a second, so you need software that's built for real-time rendering.

    Modelling in VR is kind of neat, but it's a very different sort of thing (see here, here, and here for three different approaches to it) and highly unlikely to be coming to DAZ any time soon!

    Thanks. I can stop worrying that I'm missing out on something exciting then. ;)

  • Silver Dolphin Posts: 1,588

    If you want to play with VR I would suggest Unity, not Daz, because Studio is geared to 2D rendering, not VR. Just move your 3D assets into Unity and set up your goggles. I don't recommend working in this environment for long periods because VR is in its early stages and isn't set up for long sessions. Goggles get hot with no ventilation, but the upside is you can walk around your creations. It would be great if there were a VR plugin for Daz where your secondary video card could run your VR goggles and you could look around your scenes. The problem would be interaction with 3D objects in VR; you'd need a visual toolbox that you can use inside VR to manipulate 3D objects. It would also need a game-engine renderer, as Daz only has OpenGL, which does not even support shadows, so it would look terrible versus Unity, which renders in real time. It would be nice if Eevee in Blender could be set up for VR. Maybe a future feature, maybe Blender version 3.

  • marble Posts: 7,449

    If you want to play with VR I would suggest Unity, not Daz, because Studio is geared to 2D rendering, not VR. Just move your 3D assets into Unity and set up your goggles. I don't recommend working in this environment for long periods because VR is in its early stages and isn't set up for long sessions. Goggles get hot with no ventilation, but the upside is you can walk around your creations. It would be great if there were a VR plugin for Daz where your secondary video card could run your VR goggles and you could look around your scenes. The problem would be interaction with 3D objects in VR; you'd need a visual toolbox that you can use inside VR to manipulate 3D objects. It would also need a game-engine renderer, as Daz only has OpenGL, which does not even support shadows, so it would look terrible versus Unity, which renders in real time. It would be nice if Eevee in Blender could be set up for VR. Maybe a future feature, maybe Blender version 3.

    This reflects some of the many "if only" subjects I've been reading lately. If only DAZ Studio had a real-time render engine like Eevee, for example. I don't think that many of those "if only" wishes will be fulfilled very soon though.

  • Silver Dolphin Posts: 1,588
    marble said:

    If you want to play with VR I would suggest Unity, not Daz, because Studio is geared to 2D rendering, not VR. Just move your 3D assets into Unity and set up your goggles. I don't recommend working in this environment for long periods because VR is in its early stages and isn't set up for long sessions. Goggles get hot with no ventilation, but the upside is you can walk around your creations. It would be great if there were a VR plugin for Daz where your secondary video card could run your VR goggles and you could look around your scenes. The problem would be interaction with 3D objects in VR; you'd need a visual toolbox that you can use inside VR to manipulate 3D objects. It would also need a game-engine renderer, as Daz only has OpenGL, which does not even support shadows, so it would look terrible versus Unity, which renders in real time. It would be nice if Eevee in Blender could be set up for VR. Maybe a future feature, maybe Blender version 3.

    This reflects some of the many "if only" subjects I've been reading lately. If only DAZ Studio had a real-time render engine like Eevee, for example. I don't think that many of those "if only" wishes will be fulfilled very soon though.

    Not complaining, Daz Studio is great for rendering stills and that is what it should focus on. I use DS for comic stills and it works great for that. If I need quick and dirty renders I just use OpenGL and it works great. 3Delight for massive scenes or night scenes, and Iray only for closeups of main characters (without any background). I just composite in Photoshop. For VR I would look elsewhere. I own two sets of VR goggles from Oculus and they are great! The first set of VR goggles came with external cameras and the new ones come with cameras built in. I have some 3D creation software and VR painting, but like I stated before, the headset gets too hot, and I live in the far north of the US so it does not get very hot here. VR is still in its early stages. But for games it is great. I like walking around and just looking at everything. VR is nice for gaming but not really ready for walk-around modeling and painting until it becomes more comfortable.

  • nickalaman Posts: 196

    I've rendered a few images and stitched them together and viewed them in my VR goggles. It can be breathtaking: instead of looking at a 27-inch monitor, imagine yourself in the scene. The bad news is that Daz does not make it easy: instead of letting you render the side-by-side image required for VR (like Octane does), we have to render two images and then glue them together in a second program. Iray has the functionality, it's just that Daz has not implemented it.
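    The gluing part is actually the easy bit. As a minimal sketch (assuming Pillow is installed, and that the two renders were saved as left.png and right.png, which are just placeholder names), something like this produces the side-by-side image most VR photo viewers expect:

```python
# Combine a left/right render pair into one side-by-side (SBS) stereo image.
# Assumes both renders have the same resolution.
from PIL import Image

left = Image.open("left.png")    # render from the left-eye camera
right = Image.open("right.png")  # render from the right-eye camera

sbs = Image.new("RGB", (left.width + right.width, left.height))
sbs.paste(left, (0, 0))            # left eye on the left half
sbs.paste(right, (left.width, 0))  # right eye on the right half
sbs.save("stereo_sbs.png")
```

    (Swap the two paste positions if your viewer expects a cross-eyed pair, or stack them vertically for an over/under image.)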

  • marble Posts: 7,449

    Looks like Casual might have a script for that soon. :)

    https://www.daz3d.com/forums/discussion/comment/4747816/#Comment_4747816

  • marble Posts: 7,449

    I've rendered a few images and stitched them together and viewed them in my VR goggles. It can be breathtaking: instead of looking at a 27-inch monitor, imagine yourself in the scene. The bad news is that Daz does not make it easy: instead of letting you render the side-by-side image required for VR (like Octane does), we have to render two images and then glue them together in a second program. Iray has the functionality, it's just that Daz has not implemented it.

    I'm still not sure what the goggles do. It seems to me that we are talking about 3D Stereoscopic viewing, not VR as I imagine it. Indeed, Silver Dolphin mentioned walking around objects in a post above. I can't imagine how that can be done by stitching together two images or side-by-side renders. Perhaps I have the whole concept wrong?

  • Silver Dolphin Posts: 1,588

    mcasual's script is for 2D stereo 3D (flat stereo images). Unity is a gaming environment, so it runs in real time, which is what makes it suitable for VR.

  • mindsong Posts: 1,693
    edited July 2019

    As mentioned, but maybe worth some additional thoughts:

    The usual static VR360 (sphere) scenes are neat, but ... static. Still very compelling because these can also be viewed with a headset and most any phone that has a motion-gyro (most now). Both eyes typically see the same 2D view of the surrounding image sphere. Fun, compelling, and a neat way to see a pre-rendered world from a given perspective. These are sometimes animated, but basically a pre-rendered scenario (fixed location or moving through it) and you can look around as the animation on the 2D sphere unfolds. Think: IMAX dome theater using a 'pretty smart' headset.

    3D-Stereo has the well-known two distinct views (one for each eye), but in the VR360 realm it still suffers the problem that a set of static 360 L+R spheres (or a static-frame 360 L+R sphere animation sequence) cannot physically capture correct 3D-stereo perspectives in all directions from a single POV. If you think about it for a second, it'll make sense: with two eye-width-separated cameras at a single point in time (whether single frames or each frame in an image sequence), the views will be properly aligned looking forward, collapse to nearly identical views out to your left and right, and be eyes-reversed looking backwards. So that's kind of a non-starter without some very clever distortion tricks (it *can* be done - with a few real stereo segments and good seam management, or many small stereo segments melded properly).
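    To put a rough number on why the stereo falls apart to the sides: if the two cameras sit a baseline b apart and the view is turned an angle θ away from straight ahead, only the component of the baseline perpendicular to the view direction still produces horizontal disparity, roughly

    b_eff = b · cos(θ)

    so at 90° to either side the effective baseline drops to zero (no depth at all), and behind you it goes negative, i.e. the eyes are swapped.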

    True realtime scene rendering (VIVE or Oculus), as Silver Dolphin is enjoying, actually renders the scene in front of the viewer (based on viewer-location and viewing-angle feedback) *as they are viewing*. This takes horsepower (and really good programming efficiency), so the only viable players in today's market are headsets that are attached to pretty hot machines with many-core graphics cards - not quite available to our iPhones or Galaxys at this point (getting there quickly tho'). This kind of scene can be done in true 3D-Stereo, but also works amazingly well in 2D, given the accuracy of the visual cues as you move around in the virtual world - apparently 2D parallax is a big part of our visual processing engine, even though we have full 3D-stereo capable equipment in our noggins.

    DAZ Studio doesn't do this. Not even close. Love DAZ Studio for what it does well (2D renders (or sequences) of real 3D scenes), but don't disparage it for something that it was never intended to do. (think: bicycle vs schoolbus, and you're on the right track...)

    Take into account that any of these VR360 mechanisms may only be feeding you a normal 16x9 720p/1080p TV's worth of video at a given moment, but they need to have something like 120 times (?) that amount of video available instantly to your headset to cover any possible direction you might look on a whim in your immersion. Then Stereo3D versions double that requirement. (240 channels of HD TV, all available to your little iPhone headset in realtime... Dynamic texture-mapping on steroids, eh?)
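    (The exact multiple depends entirely on the field of view and angular resolution you assume; here's just one way to rough it out, with placeholder numbers rather than measurements:)

```python
# Back-of-envelope: how much bigger is a full 360x180 sphere than what one eye
# actually displays at any instant? All numbers below are assumptions.
import math

view_w, view_h = 1920, 1080   # assumed per-eye viewport (1080p)
h_fov = 90.0                  # assumed horizontal field of view, in degrees

# matching vertical FOV for a 16:9 viewport
v_fov = math.degrees(2 * math.atan(math.tan(math.radians(h_fov / 2)) * view_h / view_w))

# full equirectangular sphere at the same angular resolution
sphere_px = (view_w * 360 / h_fov) * (view_h * 180 / v_fov)
ratio = sphere_px / (view_w * view_h)

print(f"~{ratio:.0f}x the viewport per eye, ~{2 * ratio:.0f}x for stereo")
```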

    I project we'll have something useable and affordable in this domain within 20 years, and it'll likely be on our phones or AR glasses, but we're pretty early on the road now, mostly due to hardware and bandwidth.  Exciting times though, and this (DAZ/Rendo/TurboSquid/G-warehouse, etc.) is where the content is going to come from. You will see DAZ content in many 3D-Stereo VR360 visions in the future, and I can only hope to 'walk' through some of the Stonemason worlds in my lifetime!

    The mentioned scripts from mcasual seem to hint at single/multi-frame classic 3D-Stereo scene rendering automation, with no 360-ish-ness at all (which is still super exciting to me!), although he already has another toolkit out that does work the VR360 angle using the 'segments' approach - within the DAZ Studio environment. Hey, maybe we can put these tools together and get a Stereo VR360 scene or two out of DAZ Studio! Does anyone know of a *3D-stereoscopic* VR360 viewer app?

    gonna be fun,

    --ms

     

    Post edited by mindsong on
  • marble Posts: 7,449
    mindsong said:

    As mentioned, but maybe worth some additional thoughts:

    ...

    I project we'll have something useable and affordable in this domain within 20 years, and it'll likely be on our phones or AR glasses, but we're pretty early on the road now, mostly due to hardware and bandwidth.  Exciting times though, and this (DAZ/Rendo/TurboSquid/G-warehouse, etc.) is where the content is going to come from. You will see DAZ content in many 3D-Stereo VR360 visions in the future, and I can only hope to 'walk' through some of the Stonemason worlds in my lifetime!

    The mentioned scripts from mcasual seem to hint at single/multi-frame classic 3D-Stereo scene rendering automation, with no 360-ish-ness at all (which is still super exciting to me!), although he already has another toolkit out that does work the VR360 angle using the 'segments' approach - within the DAZ Studio environment. Hey, maybe we can put these tools together and get a Stereo VR360 scene or two out of DAZ Studio! Does anyone know of a *3D-stereoscopic* VR360 viewer app?

    gonna be fun,

    --ms

     

    Excellent summary - thank you very much.

    In 20 years I will be 88 (if I'm still around). Whether I'll be hacking away at DAZ Studio renders in VR is doubtful, however. Still, maybe I will live to see it.

    In the meantime, static 3D stereoscopic images, "IMAX style", would still be helluva interesting to me, so please add further comments on how we can produce these using the tools we have right now (preferably right out of DAZ Studio).

  • nickalaman Posts: 196

    I'm currently enjoying multiple games in VR with incredible graphics, such as Star Trek: Bridge Crew, Arizona Sunshine, etc. There are even a few programs that let you import Daz models and let you pose them in VR (they are adult-only games). I have even imported some of Stonemason's worlds and walked through them in VR. Can you do this on your phone? No. I've got a powerful system that lets me do this in real time, but so do most of us here who render in Iray.

    So you want to do this today? Download Unity, save any scene as an FBX, learn a few things in Unity, import your scene, and enjoy a virtual world of your creation. Problems? Yes, there are a few, namely conversion of materials, hair, eyes, anything with trans maps.

    Quality? Unity has released an experimental build with HDRP real-time ray tracing that gets close to Iray quality in real time.

     

    I just wish that Daz would dump the OpenGL preview and switch to the Unity game engine.

     

    I know some will say Daz was not designed for that; Daz was designed for still images. All I have to say is that the original Canon 5D full-frame camera was designed for still photography, but with a little ingenuity it also became one of the best video cameras.

     

    On a personal note after discovering VR, I’m spending less and less time in Daz Studio. (And also less money)

  • marble Posts: 7,449

    I'm currently enjoying multiple games in VR with incredible graphics, such as Star Trek: Bridge Crew, Arizona Sunshine, etc. There are even a few programs that let you import Daz models and let you pose them in VR (they are adult-only games). I have even imported some of Stonemason's worlds and walked through them in VR. Can you do this on your phone? No. I've got a powerful system that lets me do this in real time, but so do most of us here who render in Iray.

    So you want to do this today? Download Unity, save any scene as an FBX, learn a few things in Unity, import your scene, and enjoy a virtual world of your creation. Problems? Yes, there are a few, namely conversion of materials, hair, eyes, anything with trans maps.

    Quality? Unity has released an experimental build with HDRP real-time ray tracing that gets close to Iray quality in real time.

     

    I just wish that Daz would dump the OpenGL preview and switch to the Unity game engine.

     

    I know some will say Daz was not designed for that; Daz was designed for still images. All I have to say is that the original Canon 5D full-frame camera was designed for still photography, but with a little ingenuity it also became one of the best video cameras.

     

    On a personal note after discovering VR, I’m spending less and less time in Daz Studio. (And also less money)

     

    A few points for me to consider there:

    • Adult games that allow DAZ character import ... well, I'm an adult ;)
    • My system is not so powerful ... Intel i7, GTX 1070, 32GB RAM, but it is over 3 years old.
    • I don't have the goggles and wouldn't have a clue what to buy or whether I could afford some. I'm guessing you spent a king's ransom on a pair or two?
    • Learning Unity is of interest but I do have other priorities. Perhaps not today then, but sometime, sure.
    • I agree about the ancient OpenGL viewport. I suggested, in another thread, that DAZ might consider Eevee but that didn't get much traction. I didn't think of Unity or Unreal because I know nothing about them as yet.
    • One question about game software: can you animate figures "by hand" or are you limited to a package of pre-packed motions, such as run, walk, jump, crouch, etc.? I have to repeat, I've never been a gamer so I can only go by what I've seen my kids play on PlayStation, etc.

     

  • nickalaman Posts: 196

     

    The ones I currently have are the Samsung Electronics HMD Odyssey+; they retail for $499 but can usually be found for about $350 on Amazon. I have a 1080 Ti, but it should run fine on a 1070, though you might not be able to hit 90 frames per second in some games.

    I learnt enough about Unity in about 2 hours to import a few scenes and open them up in VR. There are some really great tutorials out there; just google "Daz to Unity".

    As far as animating, I purchased iClone and 3DXchange. I import Daz figures into iClone, animate them and then export the animations back into Unity. iClone is much easier to animate in and includes many built-in animations.

    If you'd like more info re the adult stuff, you can PM me.

  • marble Posts: 7,449

     

    The ones I currently have are the Samsung Electronics HMD Odyssey+; they retail for $499 but can usually be found for about $350 on Amazon. I have a 1080 Ti, but it should run fine on a 1070, though you might not be able to hit 90 frames per second in some games.

    I learnt enough about Unity in about 2 hours to import a few scenes and open them up in VR. There are some really great tutorials out there; just google "Daz to Unity".

    As far as animating, I purchased iClone and 3DXchange. I import Daz figures into iClone, animate them and then export the animations back into Unity. iClone is much easier to animate in and includes many built-in animations.

    If you'd like more info re the adult stuff, you can PM me.

    Ahh thanks ... as I thought, a little above my budget. Not quite a king's ransom but the iClone suite and quality goggles would do serious damage to my spending plans. Maybe the prices will drop or DAZ animation tools will improve or both. Learning Unity will still be on my to-do list though.

  • mindsong Posts: 1,693

    Like most folks here, I have ideas that I think might have commercial potential, but I really just like hacking at this all to see if I can get there.

    I hope you enjoy your road there too, and will keep this thread alive as you progress. We're a small but interesting club, we 3D/Stereo folk, and even my 1800s era 3D-stereo-cards still tickle me to no-end. Tricking my infallible senses... Magic... all of it. :)

    Yah, keep an eye out for mcasual's upcoming stereo-3d helper script(s). I think it's going to be a wonderful tool for those of us who don't/can't invest commercial sums, but still want to play!

    cheers,

    --ms

  • mindsong Posts: 1,693

    I'm currently enjoying multiple games in VR with incredible graphics, such as Star Trek: Bridge Crew, Arizona Sunshine, etc. There are even a few programs that let you import Daz models and let you pose them in VR (they are adult-only games). I have even imported some of Stonemason's worlds and walked through them in VR. Can you do this on your phone? No. I've got a powerful system that lets me do this in real time, but so do most of us here who render in Iray.

    So you want to do this today? Download Unity, save any scene as an FBX, learn a few things in Unity, import your scene, and enjoy a virtual world of your creation. Problems? Yes, there are a few, namely conversion of materials, hair, eyes, anything with trans maps.

    Quality? Unity has released an experimental build with HDRP real-time ray tracing that gets close to Iray quality in real time.

    I'm jealous of the HW and your knowledge, but you make a good sell. A tempting but intimidating (so much to learn!) path to take.

    I just wish that Daz would dump the OpenGL preview and switch to the Unity game engine.

    At least as an option... that'd be very cool!

    I know some will say Daz was not designed for that; Daz was designed for still images. All I have to say is that the original Canon 5D full-frame camera was designed for still photography, but with a little ingenuity it also became one of the best video cameras.

    nice... I like that way of thinking!

    On a personal note after discovering VR, I’m spending less and less time in Daz Studio. (And also less money)

    lol - there's a sell, all by itself. May save the marriage too!

    :)

    --ms

  • marble Posts: 7,449

    Just to update this thread. I did buy an Oculus Rift S and it is sitting here on my desk and I'm not really sure what to do with it yet. I looked at the adult games mentioned above and, while very impressive in terms of soft-body physics, cloth simulation and 3D VR viewing, there is clearly still a lot of development to do. The one I looked at supports older Genesis 2 figures but not G3 or G8, which is a pity. Otherwise, I spent a fair amount of time viewing Google Earth in VR and that is quite awesome at first. Sooner or later you spot the tricks they use to render some of the landscape (woods, deserts, etc.) and it reminds me of old PC games - like golf - where the trees popped up as you moved down the fairway.

    I would like to get more familiar with Unity (the game I mentioned is Unity-based) so that I can take my scenes from DAZ Studio into Unity, buildings, rooms, props and all. All I have found by way of tutorials is how to get a Genesis figure/materials and textures into Unity but not whole scenes. Maybe a new project for mCasual??

    I have found the so-called "sweet spot" in the Oculus viewer very difficult to find so I'm a little disappointed with the image quality. I expected something similar to watching a 3D movie (with the glasses) in the cinema but, for me, the quality is not even close to that. My IPD is exactly 70 and perhaps that's a little wide for the Oculus.

  • I've been messing around with VR/stereo images for both Daz Studio and Unity for several years now, albeit not for high-end headsets like Oculus -- I just have a humble Google Cardboard. :)  I've been toying with the idea of writing up some of my workflow and some of the things I've learned; if there's interest, maybe I should actually do this... (though I'm sure there's a lot of good stuff online already).  Basically, whether it's VR or stereo, all I do is render an image from two different cameras (one for each eye), then combine them into a single frame in either Photoshop or video-editing software, then export that as an mp4 or mkv.
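    For an animation, that combine step just repeats once per frame. A minimal sketch with Pillow (the numbered filenames are placeholders, assuming both cameras were rendered out as matching PNG sequences); the combined frames can then be turned into an mp4/mkv with whatever video editor you already use:

```python
# Glue each left/right frame pair into a single side-by-side frame.
# Assumed filenames: left_0000.png, right_0000.png, ... (placeholders).
from PIL import Image

frame_count = 300  # however many frames were rendered

for i in range(frame_count):
    left = Image.open(f"left_{i:04d}.png")
    right = Image.open(f"right_{i:04d}.png")
    sbs = Image.new("RGB", (left.width + right.width, left.height))
    sbs.paste(left, (0, 0))
    sbs.paste(right, (left.width, 0))
    sbs.save(f"sbs_{i:04d}.png")
```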

    Like I said, I have no experience with Oculus, but presumably there are apps you can get that let you view stereo and VR mp4 videos. On Android there's an excellent free app called VR Theatre for Cardboard which lets me watch any mp4 video on my phone (both 2D and 3D) as if you're in a virtual cinema, and it can also play both 180 and 360 videos in a variety of formats (side-by-side, under/over, etc.). However, since the Google Cardboard is just a pair of lenses magnifying an image on your phone, it's fairly easy to view basic stereo images/videos without special software.

    As far as Unity goes, learning it will definitely take time, but it's definitely possible to get results fairly quickly as they already have a decent amount of tutorials and sample projects. My own VR project simply started with attaching a VR camera to the rollerball tutorial, and I built it up from there. I don't use this personally, but VRTK is supposed to be very good for getting a lot of the fundamental stuff built quickly, such as basic traversal and manipulating objects. Unity's preferred format for importing is FBX, which means it should be possible to import anything from DS (including figures with animation) into your scene. However, the biggest challenge with VR (certainly mobile VR) is performance -- depending on your hardware, you might find it difficult to have a complete DS scene running at an acceptable framerate, since most content for DS is not built with performance in mind (e.g. each character has loads of different materials, which results in multiple draw calls, which is something you're trying to keep to a minimum). With that in mind, it might make more sense to use something like ProBuilder (built into Unity) for building environments and props, and just import characters from DS (even then, I suspect they might be too high-poly). For this reason I don't tend to use DS content in Unity projects, but I'm targeting mobile, not desktop, VR.

  • @SyntheticReflections I am interested in the workflow.

  • marble Posts: 7,449

    I've been messing around with VR/stereo images for both Daz Studio and Unity for several years now, albeit not for high-end headsets like Oculus -- I just have a humble Google Cardboard. :)  I've been toying with the idea of writing up some of my workflow and some of the things I've learned; if there's interest, maybe I should actually do this... (though I'm sure there's a lot of good stuff online already).  Basically, whether it's VR or stereo, all I do is render an image from two different cameras (one for each eye), then combine them into a single frame in either Photoshop or video-editing software, then export that as an mp4 or mkv.

    Like I said, I have no experience with Oculus, but presumably there are apps you can get that let you view stereo and VR mp4 videos. On Android there's an excellent free app called VR Theatre for Cardboard which lets me watch any mp4 video on my phone (both 2D and 3D) as if you're in a virtual cinema, and it can also play both 180 and 360 videos in a variety of formats (side-by-side, under/over, etc.). However, since the Google Cardboard is just a pair of lenses magnifying an image on your phone, it's fairly easy to view basic stereo images/videos without special software.

    As far as Unity goes, learning it will definitely take time, but it's definitely possible to get results fairly quickly as they already have a decent amount of tutorials and sample projects. My own VR project simply started with attaching a VR camera to the rollerball tutorial, and I built it up from there. I don't use this personally, but VRTK is supposed to be very good for getting a lot of the fundamental stuff built quickly, such as basic traversal and manipulating objects. Unity's preferred format for importing is FBX, which means it should be possible to import anything from DS (including figures with animation) into your scene. However, the biggest challenge with VR (certainly mobile VR) is performance -- depending on your hardware, you might find it difficult to have a complete DS scene running at an acceptable framerate, since most content for DS is not built with performance in mind (e.g. each character has loads of different materials, which results in multiple draw calls, which is something you're trying to keep to a minimum). With that in mind, it might make more sense to use something like ProBuilder (built into Unity) for building environments and props, and just import characters from DS (even then, I suspect they might be too high-poly). For this reason I don't tend to use DS content in Unity projects, but I'm targeting mobile, not desktop, VR.

    Thanks, that all seems interesting and sharing your workflow would definitely be helpful to those of us who are new to stereoscopic viewing and VR. As for the many tutorials people mention, either my search skills are letting me down or I am looking for the wrong thing... I keep finding tutorials about coding for games (Unity is a game engine, after all) but very little about importing content, particularly DAZ content. I have seen forum posts discussing using Decimator or decimating in Blender, but these don't describe the workflow you mention.

    Again, for stereoscopic images I have seen posts here describing setting up two cameras and using something like StereoPhoto Maker to combine the images, but no mention of what software is required to view these images in VR. There always seems to be a vital piece of information missing.
     

  • Robinson Posts: 751
    edited November 2019

    @SyntheticReflections I am interested in the workflow.

    Me too. I've got an old Oculus DK II knocking about somewhere. Would be fun to try it out. The only thing I managed to do before was export a Daz character with some animation into Unity and view her in 3D there. It was kind of fun.

    Post edited by Robinson on
  • GlennF Posts: 141

    I would also be interested in your workflow. I only have a cheap headset like Google Cardboard.

  • davidtriune Posts: 452
    edited November 2019

    I posted a way to view things in VR, here https://www.daz3d.com/forums/discussion/comment/4564516/#Comment_4564516

    You can also render stereoscopic 360 photos by rendering one fisheye camera with a stereo offset of -20mm and +20mm (they don't go beyond that even though you can dial them up or down more). Then scale down the world by about 40%. Here are ones that I made. Tested using Virtual Desktop. It has an environment creator that lets you create 3D panoramas from 2 panoramic images.

    [Attached: two example 360° stereo renders, 1536 x 1536 and 3072 x 3072]
    Post edited by davidtriune on
  • marble Posts: 7,449

    I posted a way to view things in VR, here https://www.daz3d.com/forums/discussion/comment/4564516/#Comment_4564516

    You can also render stereoscopic 360 photos by rendering one fisheye camera with a stereo offset of -20mm and +20mm (they don't go beyond that even though you can dial them up or down more). Then scale down the world by about 40%. Here are ones that I made. Tested using Virtual Desktop. It has an environment creator that lets you create 3D panoramas from 2 panoramic images.

    Thanks for that - I hadn't found your previous post but then this forum is not the easiest to search.

    Your stereo offset (+/- 20mm) - is that equivalent to the IPD for VR? If so, mine is 70mm, which I think means I would need to set +/- 35mm, right? Also, I'm a bit confused when you say to render one fisheye camera. All the other methods I have read about require two cameras set to spherical lens (I think) and separated by the IPD value. I do need to try all this yet but have been so busy with other things and have not had the time.

    I do have Virtual Desktop - got it from Steam a few days ago - but have not viewed any stereo images yet. Have viewed a couple of the HDRI images (they need to be JPG, not HDR) and they are incredible. 

  • davidtriune Posts: 452
    edited November 2019
    marble said:

    I posted a way to view things in VR, here https://www.daz3d.com/forums/discussion/comment/4564516/#Comment_4564516

    You can also render stereoscopic 360 photos by rendering one fisheye camera with a stereo offset of -20mm and +20mm (they don't go beyond that even though you can dial them up or down more). Then scale down the world by about 40%. Here are ones that I made. Tested using Virtual Desktop. It has an environment creator that lets you create 3D panoramas from 2 panoramic images.

    Thanks for that - I hadn't found your previous post but then this forum is not the easiest to search.

    Your stereo offset (+/- 20mm) - is that equivalent to the IPD for VR? If so, mine is 70mm, which I think means I would need to set +/- 35mm, right?

    It's limited to -20 and +20 even though you can dial it more than that. So if you dial -30, the render will still be -20. That's why I said you have to scale down the world by about 40% to compensate, because 40mm is only about 60% of our IPD, which is about 65mm.
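    Worked out roughly: to make the 40mm maximum baseline behave like a real ~65mm IPD, the scene has to shrink by the same ratio, scale ≈ 40 / 65 ≈ 0.6, so everything ends up at about 60% of its original size, which is the "scale down by about 40%" above.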

    Although this was on Daz 4.11; not sure if it's fixed in 4.12.

    Also, I'm a bit confused when you say to render one fisheye camera. All the other methods I have read about require two cameras set to spherical lens (I think) and separated by the IPD value. I do need to try all this yet but have been so busy with other things and have not had the time.

    Yes, one camera, because it is a "faux 3D" method for 360-degree images; otherwise you would need to render hundreds of images for a 360 image. (The method is called "omnidirectional stereo (ODS) projection".)

    Post edited by davidtriune on
  • marble Posts: 7,449
    edited November 2019
    marble said:

    I posted a way to view things in VR, here https://www.daz3d.com/forums/discussion/comment/4564516/#Comment_4564516

    You can also render stereoscopic 360 photos by rendering one fisheye camera with a stereo offset of -20mm and +20mm (they don't go beyond that even though you can dial them up or down more). Then scale down the world by about 40%. Here are ones that I made. Tested using Virtual Desktop. It has an environment creator that lets you create 3D panoramas from 2 panoramic images.

    Thanks for that - I hadn't found your previous post but then this forum is not the easiest to search.

    Your stereo offset (+/- 20mm) - is that equivalent to the IPD for VR? If so, mine is 70mm, which I think means I would need to set +/- 35mm, right?

    It's limited to -20 and +20 even though you can dial it more than that. So if you dial -30, the render will still be -20. That's why I said you have to scale down the world by about 40% to compensate, because 40mm is only about 60% of our IPD, which is about 65mm.

    Although this was on Daz 4.11; not sure if it's fixed in 4.12.

    Also, I'm a bit confused when you say to render one fisheye camera. All the other methods I have read about require two cameras set to spherical lens (I think) and separated by the IPD value. I do need to try all this yet but have been so busy with other things and have not had the time.

    Yes, one camera, because it is a "faux 3D" method for 360-degree images; otherwise you would need to render hundreds of images for a 360 image. (The method is called "omnidirectional stereo (ODS) projection".)

    I looked up ODS and found this description (it is about videos but I would imagine the same applies to static images):

    http://www.leadingones.com/articles/intro-to-vr-5.html

    I found the opening couple of paragraphs interesting in view of what you were saying about the single camera.

    The main difference between 360 and VR videos is that VR videos are stereoscopic (have a depth component to the image). 360 videos or 360 monoscopic videos can be captured or rendered using one spherical camera or a single camera on a rig. They are said to have a “spherical” view of a scene.

    To add the aspect of depth in VR videos, they have to be captured/rendered with two cameras, one for each eye. In real life, the distance between your eyes causes a slight change between the two images that each eye individually sees (the view from your left eye is slightly shifted from the view from your right eye). This disparity is what causes the perception of depth when you look at the world around you.

    Post edited by marble on
  • davidtriune Posts: 452
    edited November 2019

    That is talking about monoscopic scenes. ODS is only used for 360-degree scenes; you can just use 2 cameras for 180-degree scenes.

     

    Here's a good article on it

    https://developers.google.com/vr/jump/rendering-ods-content.pdf

    Post edited by davidtriune on
  • Hi all, this is a good thread, Marble. I have a similar interest in that I would like to be able to view Daz scenes in VR, although my ambitions are not real high. I do not need to animate. I would be happy just to see a static room scene in VR and, at most, be able to move around in it.

    My question is: how well do Daz assets transfer to Unity? From what I can tell there is no problem with the mesh, but how well do things like Iray shaders and morphs transfer over? Below is a thread that gives some examples that look promising.

    https://www.daz3d.com/forums/discussion/311406/contents-in-unity-game-engine#latest

    Obviously you are rendering in a different environment so I would imagine lights would have to be redone. How much needs to be reworked for a complete scene that was originally put together in Daz Studio?

  • marble Posts: 7,449
    scot60656 said:

    Hi all, this is a good thread, Marble. I have a similar interest in that I would like to be able to view Daz scenes in VR, although my ambitions are not real high. I do not need to animate. I would be happy just to see a static room scene in VR and, at most, be able to move around in it.

    My question is: how well do Daz assets transfer to Unity? From what I can tell there is no problem with the mesh, but how well do things like Iray shaders and morphs transfer over? Below is a thread that gives some examples that look promising.

    https://www.daz3d.com/forums/discussion/311406/contents-in-unity-game-engine#latest

    Obviously you are rendering in a different environment so I would imagine lights would have to be redone. How much needs to be reworked for a complete scene that was originally put together in Daz Studio?

    I have the same questions. I have installed Unity and have watched a couple of tutorials to find my way around the interface, but many of the tutorials are aimed at coders who want to get into game creation which, obviously, is the main purpose of the Unity engine. So there is a different emphasis from what I am used to with DAZ Studio. For example, I found a tutorial about importing clothes from DAZ Studio to Unity but, again, coding was necessary to get the clothes to stay on the figure when it moves. I don't know about Iray materials yet, but they don't seem to export too well to other platforms such as Blender, so I am not very optimistic about Unity either.