UberEnvironment2 - Can Anybody Explain The Result Of This Simple Test? [UE2 problems/fixes]

3dcheapskate Posts: 1,167

SUMMARY OF WHERE WE'VE GOT TO AFTER 9 PAGES OF POSTS

Bug report 152637 filed by Takeo on 13th October (see post 114)

ketthrove's fix has been approved as a workaround (post #117) - but what in tarnation IS ketthrove's fix?

UE2 IBL Transformer is a workaround for the known offset errors

Maybe fixing UE2 isn't the right way to go? Maybe a new HDR environment built from scratch is what's required? (see page 8/9 of this thread)


ORIGINAL RESULTS HAVE BEEN EXPLAINED...
1) You MUST NOT use a JPG image for the UE2 IBL (Parameters > Light > Basic > Color). You need a correctly prepared TIF (see the bottom of http://www.omnifreaker.com/index.php?title=UberEnvironment )
2) There are known offset errors in UE2 - the UE2 IBL Transformer seems to fix them, as does manually rotating the Worldball (both xRot and yRot to -90) and the environment sphere (not sure of the value - I never use it).
3) Alternatively, reorient the TIF image that you plug into the UE2 light as per posts 50/51, and then you don't need to rotate UE2. (I'm ignoring the Environment Sphere again.)
These three things resolved my original problem, and UE2 seems to work well enough for me now.
BUT THERE ARE STILL OTHER STRANGE THINGS HAPPENING IF YOU TRY USING PROPER HDR IMAGES FOR THE LIGHTS, and from page 3/4 onwards this thread deals mainly with them.


(confused by the different mappings? check this thread - Test Images (and more) For IBL and Environment Maps )

ORIGINAL OP STARTS BELOW:

Ever since my first tests with the UberEnvironment2 I've been troubled - I just cannot seem to get rid of this belief that the image plugged into the Light > Basic > Color should be an angular mapped image, not a lat/long mapped image. Every piece of documentation (and the lat/long mapped HDR .tif images that come with DS4) tells me I'm wrong, but every test I do contradicts the documentation.

I mentioned this on one of Casual's threads (don't go and read the thread till you've tried the test below though!), and was shown some experiments that persuaded me that I was wrong.


However, I've just done it again. If lat/long mapping is correct, please can somebody explain the results of this very, very simple test. I'm not even including screenshots as I'd like other people to try it:

1) Open DAZ Studio 4 (I'm still using 4.5.1.6 64-bit)
2) Create > New Primitive > Cube
3) Load the UberEnvironment2 (Light Presets > omnifreaker > UberEnvironment2 > !UberEnvironment2Base.duf)
4) Apply a lighting set (Set HDR KHPark.duf)
5) Set the highest quality (Set Quality 4xHi.duf)
6) With the UberEnvironment2 selected go to the Parameters tab > Light > Basic and set Environment mode to 'Ambient (no ray tracing)'
7) Replace the Light > Basic > Color image (if you used the KHPark.duf it'll be set to OmKHPark_EnvM.tif) with the attached 128x64 jpg image.

8) BEFORE you render (do this bit in your head as you won't see anything in the preview) look at the image attached to this post and write down what colour each of the cube faces should be - front, back, top, bottom, left and right - if this is applied using a lat/long mapping.

9) Render from several different angles and note down the actual colour of each face. My guess is that it doesn't match what you'd predicted... Or is it just me?
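For anyone doing step 8 on paper, the prediction under a lat/long (equirectangular) mapping can be sketched in a few lines of Python. The mapping convention below (u from azimuth, v running from 0 at the zenith to 1 at the nadir, with the front direction at the image centre) is an assumption - conventions vary between renderers:

```python
import math

def latlong_uv(d):
    """Map a unit direction (x, y, z) to (u, v) in a lat/long image.
    Assumed convention: v = 0 at the zenith, v = 1 at the nadir,
    u wraps with azimuth and the +z axis lands at the image centre."""
    x, y, z = d
    u = (0.5 + math.atan2(x, z) / (2 * math.pi)) % 1.0
    v = math.acos(max(-1.0, min(1.0, y))) / math.pi
    return u, v

# Face normals of an axis-aligned cube
for name, n in [('top', (0, 1, 0)), ('bottom', (0, -1, 0)),
                ('front', (0, 0, 1)), ('back', (0, 0, -1)),
                ('left', (-1, 0, 0)), ('right', (1, 0, 0))]:
    print(name, latlong_uv(n))
```

With the 128x64 test image (green with a red band at top and bottom), the top and bottom normals land at v = 0 and v = 1 (the red bands) and all four side normals land at v = 0.5 (green) - i.e. red top/bottom and green sides, before any blurring is taken into account.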

test5.jpg
128 x 64 - 1022B

Comments

  • Vaskania Posts: 5,900

    There's a bug in UE2. It was on the bug tracker, but since they moved the tracker, you can't get to it. The rotation of the UE2 sphere doesn't match where the light is apparently coming from.

    Here's the thread: http://www.daz3d.com/forums/discussion/18998/

    The detailed workaround, with actual numbers, is in post #4.

  • 3dcheapskate Posts: 1,167
    edited September 2013

    Yeah, the Environment Sphere is about 22° out in azimuth, IIRC? I always delete the Environment Sphere as soon as I've loaded the UE2 anyway (partly due to that issue).

    This, however, is something completely different... (I'm 99% certain! ;o)

    I've attached another two images to plug into Light > Basic > Color.

    test3.jpg
    128 x 64 - 3K
    test2.jpg
    128 x 64 - 3K
  • Vaskania Posts: 5,900
    edited September 2013

    Where you say *before the render.. etc*, I'm not seeing any actual color being projected onto the cube. I only see color if I render.

    /edit
    Those other 2 you posted come out in a lovely shade of 'deadly pink'. Ouch, my eyes.

    ColorCube2.jpg
    800 x 800 - 35K
    ColorCube.jpg
    1000 x 547 - 107K
  • 3dcheapskate Posts: 1,167
    edited September 2013

    Vaskania said:
    There's a bug in UE2. It was on the bug tracker, but since they moved the tracker, you can't get to it. The rotation of the UE2 sphere doesn't match where the light is apparently coming from.

    Here's the thread: http://www.daz3d.com/forums/discussion/18998/

    The detailed workaround, with actual numbers, is in post #4.

    I've had a quick look through that thread, and added a post to the end...

  • Vaskania Posts: 5,900
    edited September 2013

    What am I supposed to see? Green sides and a red top/bottom?

    /edit
    Plugging in those maps also overrides any diffuse color on the cube rather than multiplying with it.

  • 3dcheapskate Posts: 1,167
    edited September 2013

    Vaskania said:
    Where you say *before the render.. etc*, I'm not seeing any actual color being projected onto the cube. I only see color if I render.

    /edit
    Those other 2 you posted come out in a lovely shade of 'deadly pink'. Ouch, my eyes.

    The 'before you render' is simply a bit of figuring it out in your head (I've added a bit to the OP to clarify). Just look at the image attached to the first post and imagine it applied as a lat/long map. If you're familiar with how a lat/long map is pasted onto a sphere you should be able to work it out.

    The results you've got are exactly the same as I'm getting, and are to my mind proof positive that the mapping can't be lat/long.

  • 3dcheapskate Posts: 1,167
    edited September 2013

    Vaskania said:
    What am I supposed to see? Green sides and a red top/bottom?

    /edit
    plugging in those maps also override any diffuse color on the cube rather than multiplying with it.

    Yup, that's what I'd expect if the mapping was lat/long. I just wanted somebody else to confirm it, without any prompting, to prove to myself that I wasn't away in cloud-cuckooland! ;o) (I spend a lot of my time there... it's rather lovely, and restful)

  • Vaskania Posts: 5,900
    edited September 2013

    Well then, glad to have helped verify you're not completely insane ( at least not in this regard, apparently :P ).

    /edit
    I just took a look at the default TIFs for the env. None of them have straight-across banding like yours does, which may be why yours is messing up.

  • 3dcheapskate Posts: 1,167

    I've just downloaded tofusan's IBL Transformer from the thread you mentioned. Tried using that on the image in the first post... and...

    ...same problem. Green sides and yellow top.

    So, my initial question still stands, even after using IBL Transformer.

  • 3dcheapskate Posts: 1,167
    edited September 2013

    Vaskania said:
    Well then, glad to have helped verify you're not completely insane ( at least not in this regard, apparently :P ).

    /edit
    I just took a look at the default TIFs for the env. None of them have straight-across banding like yours does, which may be why yours is messing up.

    My images are specifically test images. Basic scientific principle, something like...
    - make your observations (many and varied since April!)
    - come up with a theory (the mapping is not lat/long)
    - come up with a test and predict the results based on your theory (these images are the test: green sides and red top with the OP image would prove me wrong; green front with all other faces yellow would tend to support my theory that the mapping's angular)
    - perform the test and check whether the results fit your theory (hmmm... green sides, yellow top and bottom? I think the jury's out on that one... :o)

  • 3dcheapskate Posts: 1,167
    edited September 2013

    (another very important part of the scientific principle is to record your results accurately!)

    Correction: my actual results with just the UE2, cube, and the OP image (green with red band at top and bottom) were:
    - Top/Bottom: Yellow - 236,255,5/234,255,2
    - Back: Yellow - 255,255,2
    - Front: Green - 0,255,3
    - Left/Right: Green - 88,255,4/89,255,3

    Colours recorded by doing renders from the top, front, etc. cameras, saving them as PNG, and checking the colours in GIMP.


    Using the IBL Transformer with the IBL file (UE2 radio button), no env sphere, no AO:
    - Back/Front: yellow
    - Left,right,top,bottom: green

    Using UE2 with the IBL file (Auto radio button), no env sphere, no AO:
    - Top/Bottom: yellow
    - All sides: green

    (didn't record precise colours when using IBL Transformer, since the green/yellow seems to indicate the same issue as the standard UE2)

  • 3dcheapskate Posts: 1,167

    Enough for now - time to go soak my overheating brain in a bowl of cool fresh water...

  • adamr001 Posts: 22
    edited September 2013

    Everything is as expected as far as I can tell. The top and bottom are yellow because Red + Green = Yellow when you're talking about additive color mixing.
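    Adam's point is plain additive light mixing, which is easy to sanity-check (pure Python, clamping to the usual 8-bit range):

```python
# Additive light mixing: red light + green light = yellow
red, green = (255, 0, 0), (0, 255, 0)
mixed = tuple(min(255, a + b) for a, b in zip(red, green))
print(mixed)  # (255, 255, 0) - yellow
```

That matches the recorded top/bottom values of roughly 236,255,5 - red and green contributions summing toward yellow.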

  • Fuzzy Gnome Posts: 0

    Try this image.

    test_8.png
    300 x 300 - 3K
  • 3dcheapskate Posts: 1,167

    Try this image.

    I haven't tried this one yet, but here's what I'd expect:

    - If the mapping UE2 applies to the color image is angular I'd expect:
    front=red,
    top=green,
    back=blue,
    left/right/bottom=black.
    Since the image is an angular map I'd expect the colours to be near enough spot-on (i.e. red=255,0,0, green=0,255,0, blue=0,0,255, black=0,0,0).

    - If the mapping UE2 applies to the color image is lat/long I'd expect:
    front=red, but with a bit of green and black - say 236,20,0
    left/right=black, but with a bit of green and maybe a touch of blue - say 0,5,5
    top/bottom/back=blue, the top may have a touch of green (say 0,5,255), bottom/back a touch of black (say 0,0,250)
    (This assumes that UE2 applies some blurring/convolution to the image, which seems to be the case from the tests I did with the images in the first and second posts.)

    Off to try it out now... back soon... (I promise I won't edit this post!)
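    The two sets of expectations above come from tracing each face normal into the image under the two mappings. The angular-map half can be sketched like this - the convention below (forward axis at the image centre, radius growing linearly with the angle away from it, HDRShop-style) is an assumption:

```python
import math

def angular_uv(d):
    """Map a unit direction to (u, v) in an angular map.
    Assumed convention: the forward axis (-z) sits at the image centre
    and radius grows linearly with the angle away from it, reaching
    the image edge (r = 0.5) for the straight-backward direction."""
    x, y, z = d
    theta = math.acos(max(-1.0, min(1.0, -z)))  # angle from the forward axis
    sin_t = math.hypot(x, y)
    if sin_t < 1e-12:              # straight ahead or behind: azimuth is degenerate
        return 0.5, 0.5
    r = theta / (2 * math.pi)      # 0 at the centre, 0.5 at the edge
    return 0.5 + r * x / sin_t, 0.5 - r * y / sin_t

print(angular_uv((0, 0, -1)))  # forward -> image centre
print(angular_uv((0, 1, 0)))   # straight up -> halfway between centre and top edge
```

Comparing where each cube-face normal lands under this mapping versus a lat/long one is exactly what separates the two hypotheses being tested here.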

  • adamr001 Posts: 22

    Try this image.

    This image has the wrong dimensions for a UE2 map.

    It must be 2x width to 1x height. E.g. 1024 x 512

  • 3dcheapskate Posts: 1,167

    My results:
    -Front=reddish-orange
    -Top=pale green
    -Left=mid/dark-grey
    -Back=blue
    -Bottom=dark purple
    -Right=dark grey

    bbr.jpg
    942 x 588 - 134K
    tfl.jpg
    941 x 588 - 139K
  • 3dcheapskate Posts: 1,167

    adamr001 said:
    Try this image.

    This image has the wrong dimensions for a UE2 map.

    It must be 2x width to 1x height. E.g. 1024 x 512

    Simple to check - resize FuzzyGnome's image and try that instead - just off to do that.

  • 3dcheapskate Posts: 1,167
    edited September 2013

    Resized the image in GIMP to 300x150. Fired up DS4.5 again and applied the correctly sized image. Here's what I got. Looks the same to me. I'm fairly sure already (from lots of empirical testing) that the UE2 takes whatever size image you plug in and converts it to the size it needs - whether that's a 2:1 lat/long or 1:1 angular mapping ratio is a moot point.
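    For anyone scripting the resize step rather than using GIMP, a minimal nearest-neighbour sketch in plain Python (the image is assumed to be held as rows of (r, g, b) tuples - this is just an illustration, not what UE2 does internally):

```python
def resize_nearest(pixels, new_w, new_h):
    """Nearest-neighbour resize of an image stored as rows of (r, g, b) tuples."""
    old_h, old_w = len(pixels), len(pixels[0])
    return [[pixels[y * old_h // new_h][x * old_w // new_w]
             for x in range(new_w)]
            for y in range(new_h)]

# Squash a square image to the 2:1 ratio adamr001 suggested
# (a tiny 4x4 stand-in here, squashed to 4x2)
src = [[(c, c, c)] * 4 for c in (0, 85, 170, 255)]
dst = resize_nearest(src, 4, 2)
print(len(dst), len(dst[0]))  # 2 4
```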

    test_8a.png
    300 x 150 - 1K
    bbr8a.jpg
    765 x 543 - 83K
    tfl8a.jpg
    755 x 661 - 102K
  • 3dcheapskate Posts: 1,167
    edited September 2013

    These two images are how I determined what to expect from Fuzzy Gnome's test image (except I did it in my head). The most important point is the faces which would clearly be different colours with the two mappings, even after taking account of any blurring/convolution the UE2 applies - for this case I'd say it's the top and bottom.

    test_8-as-AM.jpg
    512 x 512 - 93K
    test8a-as-LL.jpg
    512 x 256 - 51K
  • 3dcheapskate Posts: 1,167
    edited September 2013

    ...and the same for the red/green image in the OP. For this one the top, bottom and back would be the main clues, with the back being the one that clinches it.

    test5-as-AM.jpg
    512 x 512 - 98K
    test5-as-LL.jpg
    512 x 256 - 44K
  • 3dcheapskate Posts: 1,167

    The more I look at this the more certain I am - but I have yet to find that piece of proof that makes it 100% certain or blows it out of the water completely. The blurring/convolution that the UE2 seems to apply muddies the waters. I don't have a clue exactly how much is applied or what method is used, and I don't know enough about blurring/convolution methods to make any sensible comments anyway!

    Enough for now, once again. Time to let other people look into it a bit more - judging by the number of views there are a few people who are interested in this...

    I'll be back later today or tomorrow.

  • Takeo.Kensei Posts: 1,052

    I didn't run the test, but the first thing I'd say is that you didn't prepare your image for UE.
    Read the bottom of http://www.omnifreaker.com/index.php?title=UberEnvironment
    Get Omnifreaker's utility and convert your jpg first.

    Second point: like Adam said, colors are mixed. It seems to me you're expecting a single color taken from the normal of the cube, which is not the case.
    The thing is, UE samples over a half sphere and averages the result. If we could control the sampling angle and narrow it to a few degrees you could get what you have in mind, but then I'd ask myself whether light behaves that way.
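    The half-sphere point can be sketched numerically: if each face colour is a (cosine-weighted) average of the environment over the hemisphere around the face normal, then a red-pole/green-elsewhere environment like the OP's test image gives a yellowish mix on the top face rather than pure red. The Monte-Carlo sketch below is a generic model under that assumption, not UE2's actual sampling:

```python
import math
import random

def _cross(a, b):
    return (a[1]*b[2] - a[2]*b[1],
            a[2]*b[0] - a[0]*b[2],
            a[0]*b[1] - a[1]*b[0])

def hemisphere_average(normal, lookup, n=5000, seed=1):
    """Cosine-weighted average of an environment 'lookup' over the hemisphere
    around a unit 'normal' - a rough model of how an ambient/IBL light mixes
    colours. (A sketch only: UE2's real sampling and weighting may differ.)"""
    rng = random.Random(seed)
    # Build an orthonormal basis (t, b, normal) around the normal
    helper = (1.0, 0.0, 0.0) if abs(normal[0]) < 0.9 else (0.0, 1.0, 0.0)
    t = _cross(helper, normal)
    tl = math.sqrt(sum(c * c for c in t))
    t = tuple(c / tl for c in t)
    b = _cross(normal, t)
    acc = [0.0, 0.0, 0.0]
    for _ in range(n):
        u1, u2 = rng.random(), rng.random()
        r, phi = math.sqrt(u1), 2 * math.pi * u2       # cosine-weighted sample
        lx, ly, lz = r * math.cos(phi), r * math.sin(phi), math.sqrt(1 - u1)
        d = tuple(lx * t[i] + ly * b[i] + lz * normal[i] for i in range(3))
        c = lookup(d)
        for i in range(3):
            acc[i] += c[i]
    return tuple(a / n for a in acc)

# Environment roughly like the OP's test image: red near the poles, green elsewhere
def env(d):
    return (1.0, 0.0, 0.0) if abs(d[1]) > 0.9 else (0.0, 1.0, 0.0)

top = hemisphere_average((0.0, 1.0, 0.0), env)
print(top)  # red and green both contribute -> a yellowish mix, not pure red
```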

  • 3dcheapskate Posts: 1,167
    edited September 2013

    I didn't run the test, but the first thing I'd say is that you didn't prepare your image for UE.
    Read the bottom of http://www.omnifreaker.com/index.php?title=UberEnvironment
    Get Omnifreaker's utility and convert your jpg first.

    Second point: like Adam said, colors are mixed. It seems to me you're expecting a single color taken from the normal of the cube, which is not the case.
    The thing is, UE samples over a half sphere and averages the result. If we could control the sampling angle and narrow it to a few degrees you could get what you have in mind, but then I'd ask myself whether light behaves that way.

    Thanks for the reply. Totally agree on your first point - none of the test images have been prepared in accordance with those instructions. I can accept that, based on that point alone, none of the tests I've done are valid.

    Very simple to resolve - the instructions are fairly precise, so I'll go away and correctly prepare my test images. The biggest problem I see is coming up with a good test image in the first place (your comment that UE2 takes the average from a complete hemisphere should be very helpful in that regard - many thanks)

    Just to be sure I have the steps right:

    1) Start with a 2:1 ratio (like Adam said) test image. A JPG image should be fine according to the instructions you pointed at.

    2) Do the spherical convolution in HDRShop. Using a 128x64 source image and a diffuse blur Phong exponent of 16 is the advice I've seen*. Admittedly that advice is based on an HDR source image, but the UE2 page you pointed to says JPG is fine.

    3) Run tdlmake with the -envlatl option, outputting in .tif format. Isn't that what omHdrConverter does (assuming that I understand the RDNA thread correctly)? So I should be able to open my 2:1 ratio JPG test image in HDRShop, do the spherical convolution, save it as an HDR, and then run omHdrConverter on it.

    If I go through those steps I should have a correctly prepared image. Agreed?


    On the second point, i.e. whether the colour is from a normal or is somehow 'mixed', I accept your point. (I think I already mentioned that I don't understand how UE2 does its blurring/convolution. And my guess in post 15 at what the colours would be if the mapping was lat/long was based on 'mixing' the colours showing in each face section of the lat/long image in post 20 - incorrect, I know, but it was clear that some sort of mixing/blurring/convolution is going on.)


    Finally, many thanks to yourself and Adam (and everybody else) for taking time to respond to this. Now time to go away and properly prepare some test images (although I don't think this forum allows you to upload TIFs?)


    *The procedure here - http://forum.runtimedna.com/showthread.php?45437-How-to-use-UberEnvironment2-with-your-own-HDR-pictures&p=511581 is one I've used once before for converting an HDR images to a UE2-compatible TIF

  • adamr001 Posts: 22

    *The procedure here - http://forum.runtimedna.com/showthread.php?45437-How-to-use-UberEnvironment2-with-your-own-HDR-pictures&p=511581 is one I've used once before for converting an HDR images to a UE2-compatible TIF
    This looks like a valid process. I haven't actually done it this way, though I'm going to try it.
  • 3dcheapskate Posts: 1,167
    edited September 2013

    I've just done a couple of quick tests using the image from the OP, correctly prepared, and I'm totally surprised by the result...

    ...because using a plain JPG image that looks identical to the correctly prepared TIF gives completely different results!

    So it looks very much as if Takeo.Kensei hit the nail on the head - it's all in the preparation.

    I need to do a few more checks tomorrow to be 100% sure in my own mind, but it looks like using a correctly prepared lat/long source image (with the omHDRConverter bit being absolutely key... okay, I'll admit, exactly as it says in the manual! ;o) plus the IBL Transformer is the way to get UE2 to work properly.

    Just like people were saying all along... :coolgrin:

  • Horo Posts: 5,430

    This is a very interesting thread. I found out 2 years ago that the light and environment spheres are 35° offset in azimuth. I have not yet tested elevation - or colours. Since the light (the 96-bit-per-pixel HDRI, whether hdr or tiff) is blurred by specular convolution, colours tend to get mixed.

    The first render shows that the shadow is offset from the light. The second has the hdri/env sphere rotated so that the cone points to where the bright light actually comes from.

    The third picture shows how the environment sphere was made. The grid was modelled in Bryce as a sphere, the camera set at its centre, and around it an HDRI (Tourbillion Tower), tone-mapped, as the backdrop. The 6 cube faces were rendered in Bryce and assembled into a spherical panorama.

    The last picture shows the HDRI used, specular convolved with a Phong exponent of 250 (16 seems very low to me, approaching the effect of diffuse convolution).

    This is how far I came. I used to offset the small HDRI against the env-sphere right when creating the set for using in UE2. In the long run, I think that is not a satisfactory method.

    Ex4.jpg
    800 x 400 - 12K
    Ex3.jpg
    800 x 400 - 113K
    Ex2.jpg
    800 x 530 - 85K
    Ex1.jpg
    800 x 530 - 85K
  • Horo Posts: 5,430

    As far as testing is concerned, a sphere gives a better representation than a cube. A cube has to be aligned precisely, otherwise colour mixing occurs. There is colour mixing on a sphere too, of course, but the direction from which the light shines can be seen better.

    The size of the color lights is also very important. Wide light sources mix sooner than smaller ones. It is therefore difficult to get a valid test using a specular convolved HDRI.

    This is shown in the 4 renders below. I apologise that these are Bryce renders, because (1) I'm a total DS nut (though I'm learning it a bit); (2) Bryce IBL works flawlessly because there is only one image: the HDRI produces the light and the backdrop (and it can create the specular map and export the HDRI, specular map and tone-mapped backdrop in several projections); (3) I know how to use it.

    The camera angle is quite wide - the extreme wide-angle lens is used to show a lot of the backdrop. The inset at lower right is the HDRI used (shown as an angular map - Bryce can use the HDRI in the angular map and spherical projections).

    Example 1 shows how colours are mixed when using an HDRI with wide lights.

    Example 2 shows the same setup with an HDRI with small lights (unfortunately with the colours at different places).

    Example 3 has only two small lights left and right.

    Example 4 has the lights above and below.

    ExTB.jpg
    600 x 600 - 12K
    ExLR.jpg
    600 x 600 - 13K
    ExCubeSpotA.jpg
    600 x 600 - 16K
    ExCube.jpg
    600 x 600 - 35K
  • Takeo.Kensei Posts: 1,052
    edited September 2013

    Horo said:
    This is a very interesting thread. I found out 2 years ago that the light and environment spheres are 35° offset in azimuth. I have not yet tested elevation - or colours. Since the light (the 96-bit-per-pixel HDRI, whether hdr or tiff) is blurred by specular convolution, colours tend to get mixed.

    The first render shows that the shadow is offset from the light. The second has the hdri/env sphere rotated so that the cone points to where the bright light actually comes from.

    The third picture shows how the environment sphere was made. The grid was modelled in Bryce as a sphere, the camera set at its centre, and around it an HDRI (Tourbillion Tower), tone-mapped, as the backdrop. The 6 cube faces were rendered in Bryce and assembled into a spherical panorama.

    The last picture shows the HDRI used, specular convolved with a Phong exponent of 250 (16 seems very low to me, approaching the effect of diffuse convolution).

    This is how far I came. I used to offset the small HDRI against the env-sphere right when creating the set for using in UE2. In the long run, I think that is not a satisfactory method.

    The grid is a very good idea. I knew UE and the environment sphere were off, but was too lazy to do proper testing to find out why and how to correct it. Some people did, though I don't know whether they got it correct either. Can you try to render with the IBL Transformer found here http://oso.tea-nifty.com/blog/2011/10/daz-studio-34-i.html so that we can check whether it's 22° or 35° off, or some other value?
    [Edit] Forget that if these are Bryce renders

    @3dcheapskate: The procedure at Rendo is a nice find. I wish I'd known about that thread last year :) I would have understood some things quicker

  • Horo Posts: 5,430

    Can you try to render with the IBL Transformer found here http://oso.tea-nifty.com/blog/2011/10/daz-studio-34-i.html so that we can check whether it's 22° or 35° off, or some other value?
    [Edit] Forget that if these are Bryce renders

    The offset is 35° but I don't (yet) know whether there is also a vertical offset. I'm working to get to grips with DS in general and UE2 in particular. For the Xmas 2012 give-away from the PA, we (David and I) had an HDRI included for DS (and Carrara), and for that I had offset the HDRI by 35° to match the backdrop. But there must be a better way to correct for that.