PROBLEM UNDERSTOOD: The output of a Normal Map brick is NOT a tangent-space normal (which is what I thought). It’s the normal in camera-space after it’s been applied to the object. And by the way, camera space uses a left-handed axis system with the camera’s line of sight being the positive Z axis, so normals for ‘forward’-facing faces will have negative values.
STILL TRYING TO SOLVE!

I’ve run into a curious problem while trying to combine normal maps with DS4.5 Shader Mixer. I’ve managed to reduce the problem to something (hopefully) quite straightforward. I’ll start with the bit that works fine.

The four-brick network shown here simply takes a normal map, splits it into its components and recombines them.

Since a normal is just a unit vector, it should be easy to calculate the third component (Z, blue) from the other two (X, red and Y, green) using Pythagoras’ theorem: B = SQRT(1-((R*R)+(G*G)))
(No, there’s not really any point in doing this, but it’s just the simplest way I found to explain the problem)
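That recombination step can be sketched outside Shader Mixer. Here is a minimal Python version of the formula, assuming (as the network does) a unit-length normal whose Z is non-negative; as it turns out later in the thread, that assumption is exactly what goes wrong:

```python
import math

def blue_from_red_green(r, g):
    """Recover the Z (blue) component of a unit normal from X (red) and Y (green).

    Assumes the normal is unit length and that Z >= 0. Neither holds for the
    Normal Map brick's camera-space output, which is the root of the problem.
    """
    # Clamp to zero to guard against tiny negative values from rounding.
    return math.sqrt(max(0.0, 1.0 - (r * r + g * g)))

# A unit normal tilted in X: (0.6, 0.0, 0.8)
print(blue_from_red_green(0.6, 0.0))  # prints: 0.8
```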

That’s what this next network does. (The three bricks in the bottom right are simply to verify that the calculated Z(blue) value matches the Z (blue) output of the splitter brick - it does)

I would expect this network to work too, but it doesn’t!
(Edit: discovered an even simpler demonstration of the problem - see the fifth post (#4) for the network I used)

Here are the results when you apply the two networks to a sphere primitive (I also tried plugging the end result into the Diffuse Color input of a Surface brick, and noticed the difference there too - I include those renders as they may provide a clue).

The renders on the left are from the first network and are as expected.

The renders on the right are from the second network, and I can’t understand why they’re so different.

I’ve got to the banging-head-against-the-wall stage with this one.
An explanation as to why on earth the second network behaves so differently, even though the inputs to the final Point brick are to all intents and purposes the same, would be greatly appreciated.

I’ve just discovered an even simpler demonstration of the problem - take the ‘Z Component’ output of the ‘XYZ Components’ brick, multiply it by itself, take the square root (so you should end up with the same value as the ‘Z Component’ that you started with) and plug that into the ‘Z Value’ input of the point brick.

That should definitely work, shouldn’t it?

It doesn’t!

I get the same effect as the network in the second post (i.e. renders like the two images on the right in the third post).

To me this makes it clear that it’s not faulty maths that’s causing the problem!

SOLVED: The output of a Normal Map brick is NOT a tangent-space normal (which is what I thought). It’s the normal in camera-space after it’s been applied to the object.
And by the way, camera space uses a left-handed axis system with the camera’s line of sight being the positive Z axis, so normals for ‘forward’-facing faces will have negative values.
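This also explains the simpler sqrt demonstration a few posts up: squaring and then square-rooting returns the magnitude, so a front-facing normal's negative camera-space Z comes back positive and the normal flips. A minimal Python sketch of the sign loss (not Shader Mixer bricks, just the arithmetic):

```python
import math

# In left-handed camera space (+Z along the line of sight), a face pointing
# toward the camera has a negative Z component, e.g. z = -0.75.
z = -0.75

# Multiply Z by itself, then take the square root: the 'round trip' from
# the simpler demonstration above.
round_trip = math.sqrt(z * z)

# sqrt(z*z) is |z|, not z: the sign is lost and the normal flips.
print(z, round_trip)  # prints: -0.75 0.75
```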


Thanks. I didn’t understand why I was getting negative normal values rather than positive ones. It hadn’t occurred to me that the axes change when the space changes.


A lot simpler explanation than what I was working on when I lost power for 4 days…

Is there any hope of making a shader brick that creates a normal map for something like my Bruno skin for V4/Genesis/Genesis 2 Female?

I may be tracking something that applies the Displacement Map as a Normal Map…

[EDIT: No. False Alarm. I still know VERY little about Normal Mapping, but I did get some progress toward aligning the Normal-Map with the UV instead of with the camera.]

Simple: plug the Normal Map brick and an NTransform brick into a Binary Functional set to Multiply, set the NTransform to [From: Current To: Object], and plug the result in wherever you need it.
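Conceptually, that NTransform is just a change of basis: re-expressing the camera-space ('current') normal in the object's own frame. Here is a hedged Python sketch of the idea; the rotation matrix is a made-up example, since in DAZ Studio the renderer supplies the real current-to-object transform:

```python
def mat_vec(m, v):
    """Multiply a 3x3 matrix (given as rows) by a 3-vector."""
    return tuple(sum(m[i][j] * v[j] for j in range(3)) for i in range(3))

# Example only: object rotated 180 degrees about Y relative to the camera,
# so the camera-space front-facing normal (0, 0, -1) becomes the object's
# local +Z. The real current->object matrix comes from the renderer.
current_to_object = (
    (-1.0, 0.0,  0.0),
    ( 0.0, 1.0,  0.0),
    ( 0.0, 0.0, -1.0),
)

camera_space_normal = (0.0, 0.0, -1.0)  # front-facing in left-handed camera space
object_space_normal = mat_vec(current_to_object, camera_space_normal)
print(object_space_normal)  # prints: (0.0, 0.0, 1.0)
```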

Simple’s good! So the NTransform brick is the key - hidden in plain sight! (that’s often my blind spot ;o)
It’s such a long time since I was playing with normals that I’ve forgotten most of it, but it sounds like something along these lines is what I was after. Maybe time to dust off this project and take another look…
Thanks!

I’m sorry for the error in my last post: it should have read “From Current to Object” for the NTransform brick. The image is right. Post edited to match.