Can someone build a 360 panoramic camera for use in DAZ (for animation)? I'll pay

Luv Lee Posts: 230
edited October 2015 in The Commons

I ask because I have a VR project and I only use DAZ as a platform. I can't afford the more expensive platforms, and tools like Blender are a bear to set up and use (besides, I don't like the way Cycles looks; no offense to users, just a personal preference). Since I will be animating and not creating stills, I really, really need a simple setup process... this person wants stuff produced rather quickly. I can't seem to find a viable solution. If one cannot be built, is there a way to set up existing cameras, like placing cameras around the perimeter and "stitching them together" so they all work in conjunction with one another to create an equirectangular effect? Any help would be most welcome, as I am desperate.

 

Thank you,

 

Alicia

Post edited by Luv Lee on

Comments

  • mjc1016 Posts: 15,001

    First, which renderer?

     

  • erik leeman Posts: 262
    edited October 2015

    As I've demonstrated before in the Octane forum, there is a way to do this in Octane for DAZ Studio, but it's not particularly quick and easy.

    Basically it works like this: you render sets of 90x90-degree cube faces (six per set) by setting the appropriate camera parameters, then 'stitch' (or rather reproject) those cube faces into equirectangular images, and turn those into a 360x180-degree video sequence.

    For the stitching bit you can use several programs like Hugin, PTGui, or Pano2VR.

    Whichever tools you choose, it's a LOT of work, and hardly suitable for animation.

    I used Pano2VR for this 360x180-degree equirectangular (it was just a test); the bottom image is one of the original six cube faces.
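
    For anyone who would rather script the reprojection than use one of those GUI tools, here is a minimal nearest-neighbour sketch in Python (Pillow). The first file name follows the 'cube_face_front.jpg' attachment below; the other five names, and the face orientations, are assumptions that depend on how the six renders were made, so some axes may need flipping:

        import math
        from PIL import Image

        # Load the six 90x90-degree cube faces (file names are assumptions)
        FACES = {name: Image.open("cube_face_%s.jpg" % name).convert("RGB")
                 for name in ("front", "back", "left", "right", "up", "down")}

        def sample(x, y, z):
            # Pick the dominant axis, then project onto that cube face
            ax, ay, az = abs(x), abs(y), abs(z)
            if az >= ax and az >= ay:
                face, u, v = ("front", x / az, -y / az) if z < 0 else ("back", -x / az, -y / az)
            elif ax >= ay:
                face, u, v = ("right", -z / ax, -y / ax) if x > 0 else ("left", z / ax, -y / ax)
            else:
                face, u, v = ("up", x / ay, z / ay) if y > 0 else ("down", x / ay, -z / ay)
            img = FACES[face]
            px = min(int((u * 0.5 + 0.5) * img.width), img.width - 1)
            py = min(int((v * 0.5 + 0.5) * img.height), img.height - 1)
            return img.getpixel((px, py))

        W, H = 1920, 960
        out = Image.new("RGB", (W, H))
        for j in range(H):
            lat = math.pi / 2 - (j + 0.5) / H * math.pi
            for i in range(W):
                lon = (i + 0.5) / W * 2.0 * math.pi - math.pi
                d = (math.cos(lat) * math.sin(lon), math.sin(lat), -math.cos(lat) * math.cos(lon))
                out.putpixel((i, j), sample(*d))
        out.save("equirect.png")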

     

     

     

    grass_equirect_1920x960.jpg
    1920 x 960 - 317K
    cube_face_front.jpg
    1024 x 1024 - 1000K
    Post edited by erik leeman on
  • Oso3D Posts: 15,084

    Carrara is often only $60 on sale, and has a spherical camera, if that helps.

     

  • grinch2901 Posts: 1,247

    I haven't had any success using Carrara's spherical camera. Everything gets warped and distorted when I assign the resulting textures to the dome in Iray; not suitable for background images, for sure. Is there a secret to making it work properly?

  • I use a simple plane from the DS primitives. When looked at directly from above, with the x-axis to the right, its UV mapping is like this:

    s runs from left to right, t runs from top to bottom

    envmap-01.png
    350 x 350 - 6K
  • An equirectangular mapping, like the one used in most environment maps (you can find plenty on the internet), has a center (the direction the camera was pointing) and covers 360 degrees on the horizontal axis and 180 degrees on the vertical axis. In mathematical terms this is -pi to pi on the x-axis and -pi/2 to pi/2 on the y-axis.

    envmap-02.png
    350 x 350 - 9K
  • So the first thing I do is map the DS primitive's square to the equirectangular square. First the x-axis (the longitude): open Shader Mixer and choose the "Variable[Fixed]" brick, and set it to 's' (which is the s texture coordinate).

    envmap-shot-s01.png
    430 x 603 - 9K
    This value runs from 0 to 1 when rendered. I want it to be the longitude, so it should run from -pi to pi, i.e. I want to transform it using

    s -> (2s - 1) * Pi

    So I use a Mathematic/Binary Operation brick from the brickyard, set it to multiply, put the s into it, and set the other value to 2.

     

    envmap-shot-s02.png
    430 x 603 - 21K
  • BTW: when creating a new brick, the first thing I do is change the title to reflect what it calculates (so I set this one to '2s'); after it is connected, I hide the inputs. The display does not get too cluttered this way.

    I use another Binary Operation brick to subtract 1 from the 2s. I leave the operation at the default 'Add' and simply set the value to -1, just a habit (there is also a Subtract).

    envmap-shot-s03.png
    430 x 603 - 24K
  • Use the 'Variable[Root Context]' brick to get Pi into the system.

    envmap-shot-s04.png
    430 x 603 - 19K
  • I use another Mathematic/BinaryOperation brick to multiply the Pi and the '2s-1' with each other. The resulting value is (2s-1)*Pi and represents the longitude.

    envmap-shot-s05.png
    453 x 603 - 32K
  • Since I am basically done with the longitude, I select all those bricks, choose "Group selected bricks" from the right-mouse-button menu, and rename the resulting group to "lon" (for longitude). After that I hide the parameters of this group, so this whole bunch of bricks is nicely stuffed into a small rectangle labeled "lon".

     

    envmap-shot-s06.png
    430 x 603 - 10K
  • Next comes the latitude, which uses the t-variable. The scheme is basically the same as with the longitude.

  • This is getting lengthy, so I'll keep it shorter for the latitude. To map the 0-1 range of t to the -pi/2 to pi/2 range of the latitude, I calculate (0.5 - t)*Pi with some BinaryOperation bricks. Note that t is inverted (multiplied by -1), because t has its minimum at the top of the plane, but I want the minimum (the -pi/2) of the latitude at the bottom.
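
    In ordinary code the two mappings are one line each. A quick Python sketch of what the 'lon' and 'lat' brick groups compute:

        import math

        def lonlat(s, t):
            # s and t are the plane's texture coordinates, each running from 0 to 1
            lon = (2.0 * s - 1.0) * math.pi   # -pi at the left edge, +pi at the right edge
            lat = (0.5 - t) * math.pi         # -pi/2 at the bottom (t = 1), +pi/2 at the top (t = 0)
            return lon, lat

        print(lonlat(0.5, 0.5))   # center of the plane -> (0.0, 0.0)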

    envmap-shot-s07.png
    430 x 603 - 8K
    envmap-shot-s08.png
    430 x 603 - 25K
  • Now it gets slightly more interesting: an equirectangular mapping maps the longitude/latitude to a 3-dimensional sphere. There are many ways to do that, but they all use the sine and cosine functions to transform from the 2-dimensional parameter space to the 3-dimensional (orthogonal) image space. So I use a Mathematic/UnaryOperation brick to calculate the sine of the longitude, and name this node 'sin(lon)'.

    envmap-shot-s09.png
    430 x 603 - 21K
  • Same thing for the latitude, and again with the cosine instead of the sine. The resulting nodes are named 'sin(lon)', 'sin(lat)', 'cos(lon)', and 'cos(lat)'. What I have now is 4 values that can be combined into a 3D vector on a sphere. The first one is easy: the y-direction (the up vector) is at the top of the map (regardless of which side it is on). I use a Value brick and connect the sin(lat) output to its input. The Value brick actually does nothing; its only use is to give the value another name and to have it in another place in the Shader Mixer window, so I do not have to scroll around so much.

    envmap-shot-s10.png
    524 x 603 - 31K
  • Second, the x-direction: since I want the center of the image to be in line with the camera, x needs to be 0 at longitude 0, so I use the sine of the longitude for the x-direction. However, the size of the x-component depends on the latitude (if you are standing on the north pole, you are not moving around much), so it is multiplied by the cosine of the latitude.

    envmap-shot-s11.png
    524 x 603 - 43K
  • The z-direction (the forward/backward direction) is probably the most complicated one in all common environment maps: you never know if the direction the camera was pointing is the positive or the negative z-direction. For the x-direction mankind has mostly accepted that x points right, but whether z points forward or backward remains unresolved. In my case the z-direction points backward (DS's blue arrow points towards me when the red arrow points right), so "forward" means "negative z-direction". In the center of the equirectangular map, longitude and latitude are both zero, so both their cosines are 1, and the product of the cosines of latitude and longitude points in the positive z-direction, i.e. backward.

    envmap-shot-s12.png
    524 x 603 - 50K
  • So invert this last value by multiplying it by -1, so that the center of the map points forward (into the negative z-direction).

    envmap-shot-s13.png
    524 x 603 - 54K
  • So far, what have I got? I have some shader bricks that use the s and t values to give me 3 other values, x, y, and z, which represent a position on a sphere. Seen from the center of the sphere, this can be used as a direction in 3D space. To use it as a direction or a point, I combine x, y, and z using a Geometric/Point node.
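
    As a plain-code cross-check, the whole brick network up to this point boils down to a few lines. A Python sketch using the sign conventions discussed above (other tools may orient the axes differently):

        import math

        def sphere_direction(lon, lat):
            # world-space direction on the unit sphere for a given longitude/latitude
            x = math.sin(lon) * math.cos(lat)    # right
            y = math.sin(lat)                    # up
            z = -math.cos(lon) * math.cos(lat)   # map center (lon = lat = 0) faces -z, i.e. 'forward'
            return (x, y, z)

        print(sphere_direction(0.0, 0.0))   # center of the map -> (0.0, 0.0, -1.0)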

    envmap-shot-s14.png
    524 x 603 - 30K
  • I named that node 'direction(ws)', 'ws' for 'world space', because it is based on the origin of the world coordinate system (i.e. based at 0). The renderer performs practically every operation in camera space, so I use an NTransform (brickyard: Functions/Geometric/Transformation/NTransform) to convert it into camera space, setting "From" to "world" and "To" to "current" ("current" is the same as camera space in 3delight). The NTransform, in contrast to the Transform (no N), is used whenever I want to treat the value as a direction instead of a position (not only for normals, as the name would suggest).

    envmap-shot-s15.png
    524 x 603 - 39K
  • So far I have a point on a (virtual) sphere around the origin of 3D space, encoded as a direction in camera space. I do not have the origin itself, however. I know that the origin is at (x,y,z) = (0,0,0), but that is world space, and 3delight wants it in current space. So I create a point by using the Utility/Value brick, leaving all values at 0, and name it 'origin(ws)'.

    Like the direction, this also needs to be in camera space, so I use a Transform node to bring it there, just like the direction before. This time it is the 'Transform' node (no N), because it is a position, not a direction.
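
    The difference between the two bricks can be illustrated outside Shader Mixer with homogeneous coordinates. A small numpy sketch (the matrix is made up; strictly speaking, surface normals would also need the inverse transpose, but for plain directions the w = 0 component is the essential point):

        import numpy as np

        # A made-up world-to-camera matrix: camera at (0, 0, 5), axes aligned with the world
        world_to_camera = np.array([[1.0, 0.0, 0.0,  0.0],
                                    [0.0, 1.0, 0.0,  0.0],
                                    [0.0, 0.0, 1.0, -5.0],
                                    [0.0, 0.0, 0.0,  1.0]])

        def transform_point(m, p):       # like the Transform brick: w = 1, translation applies
            v = m @ np.array([p[0], p[1], p[2], 1.0])
            return v[:3] / v[3]

        def transform_direction(m, d):   # like the NTransform brick: w = 0, translation ignored
            v = m @ np.array([d[0], d[1], d[2], 0.0])
            return v[:3]

        print(transform_point(world_to_camera, (0, 0, 0)))       # origin -> [ 0.  0. -5.]
        print(transform_direction(world_to_camera, (0, 0, -1)))  # direction unchanged: [ 0.  0. -1.]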

    envmap-shot-s16.png
    524 x 603 - 36K
    envmap-shot-s17.png
    524 x 603 - 39K
  • millighost Posts: 261
    edited October 2015

    So far I have a point (the origin) and a direction (a point on a sphere around the origin). Now I want the renderer to go to that origin, look in that direction, and tell me what it sees there. This can be done with the Trace brick (brickyard: Lighting/Raytrace/Trace):

    (To get the R and P inputs of the Trace brick I had to enable "show hidden parameters".)

    envmap-shot-s18.png
    524 x 603 - 45K
    Post edited by millighost on
  • Finally, connect the Color output of the Trace brick to the Color input of the Surface brick. Shader Mixer creates one by default, which I kept and dragged around, but there is also one in the brickyard at Roots/Surface.

    I left the "Shadow-Pass" on, but it is probably not useful in this case.

    envmap-shot-s19.png
    524 x 603 - 32K
  • Fixme12 Posts: 589
    edited October 2015
    Luv Lee said:

    I ask because I have a VR project and I only use DAZ as a platform. I can't afford the more expensive platforms, and tools like Blender are a bear to set up and use (besides, I don't like the way Cycles looks; no offense to users, just a personal preference). Since I will be animating and not creating stills, I really, really need a simple setup process... this person wants stuff produced rather quickly. I can't seem to find a viable solution. If one cannot be built, is there a way to set up existing cameras, like placing cameras around the perimeter and "stitching them together" so they all work in conjunction with one another to create an equirectangular effect? Any help would be most welcome, as I am desperate.

    Thank you,

    Alicia

    Hi,

    why not give Unity a try? The Personal Edition is free,
    and Digital Tutors have a great tutorial about VR and Unity:
    http://www.digitaltutors.com/tutorial/2227-Getting-Started-with-VR-in-Unity

    Post edited by Fixme12 on
  • To use that shader I create a simple test scene: 6 colored cubes placed around the center of the scene, each moved 3 meters in one direction. The center (0,0,0) must remain free of objects.

    The most important part is the plane I create (from the DS primitives). I place it so that it cannot be seen from the center of the scene (in this case it is behind the red cube). It is useful to make this plane twice as wide as it is high. I also create a camera that looks directly at this plane. It does not matter much what kind of camera it is (orthographic or perspective will both work), but it should look straight at the plane, and the plane should closely fill the camera's view. The easiest way to do this is to give it a 90-degree field of view: set the frame width to 100mm and the focal length to 200mm, make the plane 200cm wide, and place the camera 200cm in front of it, for example.

    envmap-shot-s21.png
    524 x 603 - 62K
  • millighost Posts: 261
    edited October 2015

    Now apply the shader we built to the plane and render through the camera. When rendering the plane, the renderer evaluates the shader, finds the Trace brick, and traces the scene through the equirectangular mapping I set up, which generates an environment map. This is generally how camera models other than point-projection and orthographic are done in 3delight or renderman. To adjust the image quality, try adjusting the number of samples and the cone angle of the Trace brick.
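
    Conceptually the whole shader is a tiny ray generator. Loosely, in Python, this is what the renderer ends up doing for every shading sample on the plane (trace() below is just a stand-in for the Trace brick, not a real function):

        import math

        def pixel_to_ray(s, t):
            # plane UV -> lon/lat -> world-space ray, combining the earlier steps
            lon = (2.0 * s - 1.0) * math.pi
            lat = (0.5 - t) * math.pi
            direction = (math.sin(lon) * math.cos(lat),
                         math.sin(lat),
                         -math.cos(lon) * math.cos(lat))
            origin = (0.0, 0.0, 0.0)   # must be free of objects, see the test scene above
            return origin, direction

        # for every pixel of the plane the renderer effectively does:
        #   origin, direction = pixel_to_ray(s, t)
        #   color = trace(origin, direction)   # the Trace brick
        #   write color to that pixel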

    I have uploaded a shader preset for that shader: https://sites.google.com/site/millighostmix/home/shader-mixer-envgen. To be paid for by LuvLee, free for everyone else :-)

    envmap-shot-render.png
    400 x 200 - 6K
    Post edited by millighost on
  • I almost forgot to mention: the resulting environment map is usable for lighting and reflections in virtually every renderer in the world, EXCEPT 3delight (or renderman), which use a slightly different mapping. When using this kind of map in a DS ShaderMixer shader it will still work (because ShaderMixer does some transformations), but it will not work when used for lighting with UberEnv2 (which uses the original renderman mapping). 3delight and renderman use the z-axis for the latitude, not the y-axis (as most others do). I leave the necessary change as an exercise for now...
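
    For those who want to try the exercise, the change amounts to re-orienting the pole axis from y to z before tracing. An untested Python guess (the exact signs and handedness depend on 3delight's conventions, which is the point of the exercise):

        import math

        def direction_y_up(lon, lat):
            # the mapping built in this thread: y is the pole (latitude) axis
            return (math.sin(lon) * math.cos(lat),
                    math.sin(lat),
                    -math.cos(lon) * math.cos(lat))

        def direction_z_up(lon, lat):
            # rotate so the old pole axis (y) becomes z; signs may need adjusting
            x, y, z = direction_y_up(lon, lat)
            return (x, -z, y)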

  • GSSEV Posts: 32

    Hi, does this work in DS 4.8 with Iray? I could not get it to load; it came up with an error.

    Would be great if it worked.

  • argel1200 Posts: 760

    millighost, any chance you could upload an Iray-ready scene (e.g. the one with your cubes)?
