Who said Blender was hard?


Comments

  • Thanks, yeah as long as I know there are differences between how Eevee and Cycles render, it's fine. I'd render the final pic in Cycles anyway.

    I do have one more question if it's OK, because this will probably determine whether I should bother with learning Blender at all. Can you render alpha channels for individual surfaces? I absolutely rely on doing this in Studio for postwork in Photoshop, and if Blender can only render alpha channels for whole objects, it won't be enough for what I need.

    I would be very surprised if it isn't possible, but I'm afraid I am not completely clear what it is that you want to do ('rendering alpha channels for individual surfaces').  Could you explain what you are doing in DS - perhaps with (a) screenshot(s)?

  • probably asked and answered in here but anyone know a video where I can see how to work in 2.8 after being forced into 2.79 until two weeks ago because my old GPU was poo and not supported?

    You might try this from the ubiquitous Andrew Price:

    Disclaimer - I haven't watched it yet so not exactly sure what it contains, going by the title 'Where everything is in Blender 2.8'

    (I see Mr Price is sticking steadfastly to '2.8'; I'm leaning towards 2.80 ('two point eighty') because 2.81 ('two point eighty-one') is in alpha already.)

  • marble Posts: 3,608
    edited August 16

    I just watched this one because I'm a bit bewildered about the best way to start learning. The guy has a delivery like an automated railway announcement but he does deliver good advice.

     

    Post edited by marble on
  • donalddade Posts: 204
    wolf359 said:

    I just can't see it "just working". Renaming bones so they match is one thing, but doing all the Optimization, Linear Algebra and Quaternions to make it really look correct, even when there are extra bones, or missing bones, or bones of different lengths, is another. I noticed that even Reallusion didn't attempt it and licensed the tech from iKinema (or was it autodesk?). But Daz gives it away for free, and doesn't tell anyone? But I still want to try it... 

     

     

    Reallusion uses the Maya Human IK system from Autodesk.

    The 3DXchange retargeting system in iClone Pro is every bit as powerful as MotionBuilder for retargeting to non-matching skeletons, and you can save the retargeting template for later use.

    As was stated, the G1 to G3/G8 trick is a decent "last resort" workaround to get rough, base-layer legacy body motion onto a G3/8 figure in Daz Studio if you have no access to proper retargeting tools like iClone, iKinema, or MOBU.

    However, be prepared to use the graph editor in Daz Studio to globally fix hands and other limb parts that "go astray".

    The new IK system in the 4.12 beta could help a lot with foot slide after some manual setup, I am given to understand.

    I personally have no use for it as I use iClone Pro and bake my IK to FK upon export to BVH.

    Thanks for the clarification, Wolf... I bought pretty much the whole suite of RL stuff, but was extremely disappointed when their much touted support for G8s in CC3 did not support JCMs, even after they said "fully" supported. I kind of rage-quit RL after that. But 3DXchange as a cheap MB is not something that one can ignore. When WebAnimate didn't support G8s either, I was afraid that I might be stuck with MB.

  • nonesuch00 Posts: 11,554
    wolf359 said:

    I just can't see it "just working". Renaming bones so they match is one thing, but doing all the Optimization, Linear Algebra and Quaternions to make it really look correct, even when there are extra bones, or missing bones, or bones of different lengths, is another. I noticed that even Reallusion didn't attempt it and licensed the tech from iKinema (or was it autodesk?). But Daz gives it away for free, and doesn't tell anyone? But I still want to try it... 

     

     

    Reallusion uses the Maya Human IK system from Autodesk.

    The 3DXchange retargeting system in iClone Pro is every bit as powerful as MotionBuilder for retargeting to non-matching skeletons, and you can save the retargeting template for later use.

    As was stated, the G1 to G3/G8 trick is a decent "last resort" workaround to get rough, base-layer legacy body motion onto a G3/8 figure in Daz Studio if you have no access to proper retargeting tools like iClone, iKinema, or MOBU.

    However, be prepared to use the graph editor in Daz Studio to globally fix hands and other limb parts that "go astray".

    The new IK system in the 4.12 beta could help a lot with foot slide after some manual setup, I am given to understand.

    I personally have no use for it as I use iClone Pro and bake my IK to FK upon export to BVH.

    Thanks for the clarification, Wolf... I bought pretty much the whole suite of RL stuff, but was extremely disappointed when their much touted support for G8s in CC3 did not support JCMs, even after they said "fully" supported. I kind of rage-quit RL after that. But 3DXchange as a cheap MB is not something that one can ignore. When WebAnimate didn't support G8s either, I was afraid that I might be stuck with MB.

    By this time next year DAZ Studio will have equaled or surpassed most of those features & the pipeline features are already in DAZ Studio. Really only the animation capabilities are lacking.

  • SnowSultan Posts: 2,285

    Andya, sorry for the delay and if I wasn't clear enough before. Below is an example of a mask from a pic I'm currently working on. I only wanted to mask the skin of these wings, not the bones or limbs. I've seen Blender export masks of individual objects, but I don't yet know if it can create these sorts of masks from specific materials or object surfaces.

    wing2_SelectedNodes.jpg
    2200 x 1424 - 172K
  • nicstt Posts: 8,479

    Thanks, yeah as long as I know there are differences between how Eevee and Cycles render, it's fine. I'd render the final pic in Cycles anyway.

    I do have one more question if it's OK, because this will probably determine whether I should bother with learning Blender at all. Can you render alpha channels for individual surfaces? I absolutely rely on doing this in Studio for postwork in Photoshop, and if Blender can only render alpha channels for whole objects, it won't be enough for what I need.

    You can pretty much tell the shader to do anything you want; shaders can get very complicated, and do.

    I haven't done much in Cycles for a couple or more years now, but I used to use it often; the car in my profile is one example.

  • wolf359 Posts: 2,435
    edited August 16

    I bought pretty much the whole suite of RL stuff, but was extremely disappointed when their much touted support for G8s in CC3 did not support JCMs, even after they said "fully" supported. I kind of rage-quit RL after that. But 3DXchange as a cheap MB is not something that one can ignore.

    Understand that CC3's Genesis "support" is essentially a shape projection algorithm, similar to the GenX 2 plugin for Daz Studio.

    You get an iClone base avatar "doppelganger" of whatever G3/8 character you imported via FBX, so the Daz JCMs and HD morphs are never part of the equation.

     

    By this time next year DAZ Studio will have equaled or surpassed most of those features & the pipeline features are already in DAZ Studio.

    Is there new info in the change log that confirms this?

    One of the main pipeline features of iClone/3DXchange is the ability to import an FBX rig from very nearly every other character program, apply real-time iMotion data to it, and export the data back to your external program as BVH, FBX, or even Alembic.

    Daz Studio has no useful FBX import capability that I have seen demonstrated, even in the 4.12 beta.

    Really only the animation capabilities are lacking.

    As well as the LIVE FACE camera-based facial animation mocap system, support for the full-body motion capture hardware that is optionally available for iClone, and a real-time live link to Unreal 4.

    If Daz Studio "equals or surpasses" these features by this time next year, I would love to know where this roadmap of advanced animation pipeline features has been published by Daz.

    Post edited by wolf359 on
  • andya_b341b7c5f5 Posts: 549
    edited August 16

    Andya, sorry for the delay and if I wasn't clear enough before. Below is an example of a mask from a pic I'm currently working on. I only wanted to mask the skin of these wings, not the bones or limbs. I've seen Blender export masks of individual objects, but I don't yet know if it can create these sorts of masks from specific materials or object surfaces.

    No problem.  I think you can do what you want, if I have got the idea correctly.  You will have to render in Cycles and use the compositor. 

    For example, to get a mask based on a particular material or materials, I have a cube with two materials, one plain red and one blue with a check texture as a transmap, sitting on a plane with a green material of its own.  You can see the blue material set up and the render preview in the first image.

    In the View Layer tab, find the Cryptomatte subsection in the Passes section and select the Material button.  Hit F12 to render.

    Now click on the Compositing workspace tab along the top of the viewport.  You will have a Render Layer node and a Composite node connected to it.  Add a Viewer node and click on the Backdrop button at the top right of the viewport (see second image).  You should see your render behind the nodes as it appeared in the render window.

    Next, add a Cryptomatte node between the Render Layer and the Viewer node and connect them up as shown in the third image.  This should give you the 'Pick View' behind the nodes, with each material identified by a different color.  Click the Add button in the Cryptomatte node, and use the eyedropper to select the material you want to create a mask for e.g. I selected the greenish color representing the blue material with the transmap.  You should see an odd looking value appear in the field under the Add button.

    And now the magic happens.  If you connect the Image output from the Cryptomatte node to the Image input of the Viewer node you will see an image with just the selected material, and if you connect the Matte output to the Image input you will see what is essentially an alpha mask for your selected material (image four).

    To save this as an image, connect the Matte output from the Cryptomatte node to the image input of the Composite node.  Your render window should now show the mask image, and you can save it as normal from the Image menu (picture 5).

    You can select multiple materials on multiple objects - for the last image, I picked the red material on the cube and the plane material, after removing the blue material (using the Remove button and selecting with the eyedropper again).  Of course a semi-transparent area on a material will give you a corresponding grey color in your alpha mask.

    Hope that is clear enough to follow.
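To make the matte behaviour described above concrete, here is a plain-Python toy (no Blender API involved; the pixel values are made up for illustration). Applying a Cryptomatte matte in postwork amounts to scaling each render pixel by the matte value in [0, 1], which is why a semi-transparent area shows up as grey in the mask:

```python
# Toy illustration of applying a matte as an alpha mask: each matte
# value is a per-pixel coverage in [0, 1], and masking multiplies the
# corresponding render pixel by that value (grey matte = partial keep).
def apply_matte(render_pixels, matte):
    return [tuple(round(channel * alpha) for channel in pixel)
            for pixel, alpha in zip(render_pixels, matte)]

render = [(200, 40, 40), (40, 40, 200), (90, 160, 60)]
matte = [1.0, 0.0, 0.5]  # 0.5 = semi-transparent edge pixel
print(apply_matte(render, matte))  # [(200, 40, 40), (0, 0, 0), (45, 80, 30)]
```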

    alpha-mask01.png
    1920 x 1024 - 670K
    alpha-mask02.jpg
    1920 x 1024 - 300K
    alpha-mask03.jpg
    1920 x 1024 - 317K
    alpha-mask04.jpg
    1920 x 1024 - 304K
    alpha-mask05.jpg
    2006 x 1384 - 95K
    alpha-mask06.jpg
    1920 x 1024 - 309K
    Post edited by andya_b341b7c5f5 on
  • nonesuch00 Posts: 11,554
    wolf359 said:

    I bought pretty much the whole suite of RL stuff, but was extremely disappointed when their much touted support for G8s in CC3 did not support JCMs, even after they said "fully" supported. I kind of rage-quit RL after that. But 3DXchange as a cheap MB is not something that one can ignore.

    Understand that CC3's Genesis "support" is essentially a shape projection algorithm, similar to the GenX 2 plugin for Daz Studio.

    You get an iClone base avatar "doppelganger" of whatever G3/8 character you imported via FBX, so the Daz JCMs and HD morphs are never part of the equation.

     

    By this time next year DAZ Studio will have equaled or surpassed most of those features & the pipeline features are already in DAZ Studio.

    Is there new info in the change log that confirms this?

    One of the main pipeline features of iClone/3DXchange is the ability to import an FBX rig from very nearly every other character program, apply real-time iMotion data to it, and export the data back to your external program as BVH, FBX, or even Alembic.

    Daz Studio has no useful FBX import capability that I have seen demonstrated, even in the 4.12 beta.

    Really only the animation capabilities are lacking.

    As well as the LIVE FACE camera-based facial animation mocap system, support for the full-body motion capture hardware that is optionally available for iClone, and a real-time live link to Unreal 4.

    If Daz Studio "equals or surpasses" these features by this time next year, I would love to know where this roadmap of advanced animation pipeline features has been published by Daz.

    No, there is no official DAZ 3D confirmation of that. It's just me talking out the side of my mouth but I think it will be the case. I don't know what iMotion data is. I have imported FBX models before and you are right they often fail but I will start filing tickets in the future when they fail so at least they have a record of who is trying to use FBX in DAZ for what.

    The FBX export I've done has worked well in Unity with the observation that they are 'good quality' for games but not DAZ Studio iRay render quality. 

  • donalddade Posts: 204
    wolf359 said:

    I just can't see it "just working". Renaming bones so they match is one thing, but doing all the Optimization, Linear Algebra and Quaternions to make it really look correct, even when there are extra bones, or missing bones, or bones of different lengths, is another. I noticed that even Reallusion didn't attempt it and licensed the tech from iKinema (or was it autodesk?). But Daz gives it away for free, and doesn't tell anyone? But I still want to try it... 

     

     

    Reallusion uses the Maya Human IK system from Autodesk.

    The 3DXchange retargeting system in iClone Pro is every bit as powerful as MotionBuilder for retargeting to non-matching skeletons, and you can save the retargeting template for later use.

    As was stated, the G1 to G3/G8 trick is a decent "last resort" workaround to get rough, base-layer legacy body motion onto a G3/8 figure in Daz Studio if you have no access to proper retargeting tools like iClone, iKinema, or MOBU.

    However, be prepared to use the graph editor in Daz Studio to globally fix hands and other limb parts that "go astray".

    The new IK system in the 4.12 beta could help a lot with foot slide after some manual setup, I am given to understand.

    I personally have no use for it as I use iClone Pro and bake my IK to FK upon export to BVH.

    Thanks for the clarification, Wolf... I bought pretty much the whole suite of RL stuff, but was extremely disappointed when their much touted support for G8s in CC3 did not support JCMs, even after they said "fully" supported. I kind of rage-quit RL after that. But 3DXchange as a cheap MB is not something that one can ignore. When WebAnimate didn't support G8s either, I was afraid that I might be stuck with MB.

    By this time next year DAZ Studio will have equaled or surpassed most of those features & the pipeline features are already in DAZ Studio. Really only the animation capabilities are lacking.

    That, all of it, is news to me. Do you have a source I can read?
  • donalddade Posts: 204
    wolf359 said:

    I bought pretty much the whole suite of RL stuff, but was extremely disappointed when their much touted support for G8s in CC3 did not support JCMs, even after they said "fully" supported. I kind of rage-quit RL after that. But 3DXchange as a cheap MB is not something that one can ignore.

    Understand that CC3's Genesis "support" is essentially a shape projection algorithm, similar to the GenX 2 plugin for Daz Studio.

    You get an iClone base avatar "doppelganger" of whatever G3/8 character you imported via FBX, so the Daz JCMs and HD morphs are never part of the equation.

     

    By this time next year DAZ Studio will have equaled or surpassed most of those features & the pipeline features are already in DAZ Studio.

    Is there new info in the change log that confirms this?

    One of the main pipeline features of iClone/3DXchange is the ability to import an FBX rig from very nearly every other character program, apply real-time iMotion data to it, and export the data back to your external program as BVH, FBX, or even Alembic.

    Daz Studio has no useful FBX import capability that I have seen demonstrated, even in the 4.12 beta.

    Really only the animation capabilities are lacking.

    As well as the LIVE FACE camera-based facial animation mocap system, support for the full-body motion capture hardware that is optionally available for iClone, and a real-time live link to Unreal 4.

    If Daz Studio "equals or surpasses" these features by this time next year, I would love to know where this roadmap of advanced animation pipeline features has been published by Daz.

    You're right about everything, of course. I was rather new to the animation world, and didn't yet understand that "fully supported" really means "just the easiest parts we could implement". I discovered everything you said only after I had drunk deeply of the CC3 Kool-Aid. Out of all the software and motion capture plugins I bought, the only thing I use is 3DXchange, and that just for retargeting to import back into Daz.
  • SnowSultan Posts: 2,285

    Thanks again Andya, you obviously know Blender very well and I know from your explanation that it is still ridiculously user-unfriendly.  :)   It apparently took 21 freaking years for the Blender team to add a camera rotation widget, so I'm not going to hold my breath on them making material ID renders any easier either (speaking of which, like Studio, Blender doesn't anti-alias its ID maps either? Are there any actual artists working on these programs?).

    No question Blender is powerful, but it's still not for me. I appreciate the help though and thanks for taking the time to answer my questions.

  • wolf359 Posts: 2,435

     I discovered everything you said only after I had drunk deeply of the CC3 Kool-Aid. Out of all the software and motion capture plugins I bought, the only thing I use is 3DXchange, and that just for retargeting to import back into Daz.

     

    Indeed, that is my iClone retargeting pipeline as well.
    My exported Genesis characters never travel beyond the 3DXchange app, where they receive the real-time iMotion data I create in iClone, and I export the BVH back to the "actor" in Daz Studio for final tweaking, lip-sync, and costuming before export as .obj/MDD to Maxon C4D to render.

    Neither iClone nor Daz Studio is a suitable final render environment for my types of productions, because large, complex scene management (hundreds of scene items in nested hierarchies) is mission impossible in both, to say nothing of the rather poor compositing and VFX options.
       

  • Does anyone happen to have any links to setting up opacity/transparency maps in Blender 2.8? I found a couple of videos, but they seem overly complicated and I wonder if there's an easier way just to apply transmaps for things like eyelashes, leaves, and such.

    Do you use Eevee (which is set as the default render engine in 2.8) or Cycles? If you use Eevee, you need to set up Material tab --> Options --> Blend Mode = Alpha Clip, or whichever alpha option suits you.

     

    probably asked and answered in here but anyone know a video where I can see how to work in 2.8 after being forced into 2.79 until two weeks ago because my old GPU was poo and not supported?

    Sadly... unless your GPU supports OpenGL 3.3, I don't see another workaround/cheat/trick to make it work.

  • it is still ridiculously user-unfriendly.  :)  

    Yes, in this case not a cinch for sure.  I'm no expert on UX/UI design, but I see the tension between ease of use and power/flexibility is unresolved in many software applications, so it must be a hard problem.  If one is interested in doing thing A at present in some complex application, then it can be hard to do, and the fact one can also do things B through Z is no consolation at that particular time!

    speaking of which, like Studio, Blender doesn't anti-alias it's ID maps either? Are there any actual artists working on these programs?).

    Well, I think Blender does, but you have to 'play' with the pixel filter width settings a bit (Render tab->Film->Pixel Filter).  The default 1.5 pixels may not always give the best results.  (I seem to recall there was a checkbox to toggle anti-aliasing on/off in versions prior to 2.80, but it seems you can't disable it entirely now.)
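To give a feel for what the filter width does, here is a plain-Python sketch of a Gaussian-style pixel filter (the width-to-sigma mapping is an illustrative assumption, not Blender's actual filter implementation): a wider filter gives samples farther from a pixel centre more weight, so edges are smoothed more aggressively.

```python
# Sketch: a wider pixel filter weights distant samples more heavily,
# softening (anti-aliasing) edges more strongly.
import math

def gaussian_weight(distance, width):
    # Assumption for illustration: treat the filter width as ~3 sigma.
    sigma = width / 3.0
    return math.exp(-0.5 * (distance / sigma) ** 2)

for width in (1.5, 3.0):
    w = gaussian_weight(1.0, width)  # weight of a sample 1 px from centre
    print(f"width {width}: weight {w:.3f}")
```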

  • SnowSultan Posts: 2,285

    I think that's partially caused by the differences between how programmers and artists think. A programmer's priority is to get the feature into the software; if it's difficult to actually use, that's not their problem. An artist comes up with the idea for the feature but has no clue how to actually implement it.   :)

    I won't give up on Blender completely, but I can't think about making it my primary program until it can render these masks without *too* much work and as long as it has trouble handling DAZ content. What frustrates me is that they've had 21 years to work on this software with a ton of community support and yet it's still rather unnecessarily complicated and unintuitive compared to other similar programs. Apparently not too many right-brained members in that community, heh.

    Thanks again for your help so far.

  • Mythic3D Posts: 1,481
    edited August 18

    Andya, sorry for the delay and if I wasn't clear enough before. Below is an example of a mask from a pic I'm currently working on. I only wanted to mask the skin of these wings, not the bones or limbs. I've seen Blender export masks of individual objects, but I don't yet know if it can create these sorts of masks from specific materials or object surfaces.

    Not sure if this got answered yet as I'm still catching up on the thread, but you can easily do this with cryptomatte in Blender.

    The result will be a separate render with a different solid color for each material.  I actually find this much faster than having to render separate masks for each material - you just take the one image into (in my case) Photoshop, colorpick the color and use that selection to add a mask to whatever you are doing. It's two clicks to get a mask and no extra rendering time, which really speeds things up.

    EDIT:  Whoops - well maybe I should finish reading the thread before replying as I see this was actually answered already. My way is a slight variation on the method above. I think andya is trying to perfectly mimic what you currently do which is a slower workflow than this, although my way would require a slight change in how you work.

    1. Turn on the cryptomatte render layer.
    2. Render your image.
    3. Go to Compositing workspace (tab along top).
    4. Add > Matte > Cryptomatte
    5. Connect the Cryptomatte outputs on the Render Layers node to the Cryptomatte inputs on the Cryptomatte node.
    6. Connect the Pick output from the Cryptomatte node to the Image input on the Composite node.
    7. Go to the Rendering workspace.
    8. Image > Save As.

    Screenshots of the steps below. My example object is a cylinder with three material nodes.

    TURNING ON THE CRYPTOMATTE RENDER LAYER

    NODE SETUP IN THE COMPOSITING WORKSPACE

    COMPOSITED RENDER FOR EXPORT

    01 TurnOnCryptomatte.JPG
    334 x 538 - 27K
    02 ConnectCryptoNode.JPG
    1232 x 555 - 81K
    03 ConnectCryptotoComposite.JPG
    629 x 270 - 26K
    04 CryptoMattedImage.png
    720 x 576 - 235K
    Post edited by Mythic3D on
  • bluejaunte Posts: 1,325

    Anyone know how to put a script on a custom button somewhere in the UI? Or something like that?

  • SnowSultan Posts: 2,285

    ArtOfMark, that seems pretty easy, but can you anti-alias that resulting render? I still don't quite know why anyone would need a jagged Material ID render because it's useless for masking in Photoshop, but both DAZ Studio and Blender seem to want to only offer that option.

  • Mythic3D Posts: 1,481

    ArtOfMark, that seems pretty easy, but can you anti-alias that resulting render? I still don't quite know why anyone would need a jagged Material ID render because it's useless for masking in Photoshop, but both DAZ Studio and Blender seem to want to only offer that option.

    The composite render using cryptomatte is exactly as anti-aliased as the regular render. Mine just looks "stair stepped" because I rendered at a really low resolution - if I had uploaded the render too you would see that they match up exactly. I just tried it with a more complicated, higher resolution image and it definitely works.
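One way to see why the matte comes out anti-aliased even though the pick view looks like flat colors: a Cryptomatte pass stores per-pixel coverage, i.e. what fraction of the render samples in each pixel hit the picked material. A rough plain-Python sketch of that idea (the sample data here is invented for illustration):

```python
# Sketch of coverage-based mattes: each pixel records the fraction of
# its render samples that hit the picked material, so an edge pixel
# straddling two materials gets a fractional (anti-aliased) value.
def coverage_matte(per_pixel_samples, picked):
    return [sum(sample == picked for sample in samples) / len(samples)
            for samples in per_pixel_samples]

# Three pixels, four samples each: interior, edge, exterior.
samples = [["skin"] * 4, ["skin", "skin", "bone", "bone"], ["bone"] * 4]
print(coverage_matte(samples, "skin"))  # [1.0, 0.5, 0.0]
```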

  • SnowSultan Posts: 2,285
    edited August 19

    I'll test it out tonight then, thank you.

    EDIT: I tested it and it works - but I still can't figure out why. The Material ID map is clearly not anti-aliased when I look at it, but when selected and altered, it seems - so far - to give a satisfactory result. Is there something more technical going on with the edges appearing more defined than they really are or with the export format? Just seems weird. I need to test it with a much larger image and make sure my selection tool isn't anti-aliased as well, but this might solve one of the bigger problems I had with Blender. Thank you very much.

    Post edited by SnowSultan on
  • SnowSultan Posts: 2,285
    edited August 20

    EDIT: OK it does seem Cryptomatte is the way to go, but I haven't figured out the options yet. Sorry for the multiple posts, will try to wrangle this soon.

    Post edited by SnowSultan on
  • Artini Posts: 4,473

    Andya, sorry for the delay and if I wasn't clear enough before. Below is an example of a mask from a pic I'm currently working on. I only wanted to mask the skin of these wings, not the bones or limbs. I've seen Blender export masks of individual objects, but I don't yet know if it can create these sorts of masks from specific materials or object surfaces.

    Not sure if this got answered yet as I'm still catching up on the thread, but you can easily do this with cryptomatte in Blender.

    The result will be a separate render with a different solid color for each material.  I actually find this much faster than having to render separate masks for each material - you just take the one image into (in my case) Photoshop, colorpick the color and use that selection to add a mask to whatever you are doing. It's two clicks to get a mask and no extra rendering time, which really speeds things up.

    EDIT:  Whoops - well maybe I should finish reading the thread before replying as I see this was actually answered already. My way is a slight variation on the method above. I think andya is trying to perfectly mimic what you currently do which is a slower workflow than this, although my way would require a slight change in how you work.

    1. Turn on the cryptomatte render layer.
    2. Render your image.
    3. Go to Compositing workspace (tab along top).
    4. Add > Matte > Cryptomatte
    5. Connect the Cryptomatte outputs on the Render Layers node to the Cryptomatte inputs on the Cryptomatte node.
    6. Connect the Pick output from the Cryptomatte node to the Image input on the Composite node.
    7. Go to the Rendering workspace.
    8. Image > Save As.

    Screenshots of the steps below. My example object is a cylinder with three material nodes.

    TURNING ON THE CRYPTOMATTE RENDER LAYER

    NODE SETUP IN THE COMPOSITING WORKSPACE

    COMPOSITED RENDER FOR EXPORT

    Thank you very much for posting these steps. I would never have imagined that this is possible in Blender. What a powerful program it is.

  • SnowSultan Posts: 2,285

    OK, I think I figured it out but it's a bit more difficult than some of these explanations if you want to get these masks over to Photoshop for postwork. You have to set up a viewing node to show the background, then constantly disconnect/switch the image with the matte view to see what masks you're actually putting together with the eyedropper, and then open a new window set to Image Editor to actually save the masks out as individual renders. This tutorial does a pretty good job of explaining it.

     

    Powerful? Absolutely. Intuitive and designed with artists in mind? Hilariously not. Thanks to those who helped put me on the right path to figuring this out though, I appreciate it.

  • OK, I think I figured it out but it's a bit more difficult than some of these explanations if you want to get these masks over to Photoshop for postwork. You have to set up a viewing node to show the background, then constantly disconnect/switch the image with the matte view to see what masks you're actually putting together with the eyedropper, and then open a new window set to Image Editor to actually save the masks out as individual renders.

    I think it can be easier than that.  You can set up as many viewer nodes as you like.  Whichever one you have selected is the one that you will see in the backdrop.  So, for example, one for the full image, one for the pick output, one for the image output of the cryptomatte node.  You could have one for the matte output.  No more connecting/disconnecting.

    You can select to render to a new window, which saves opening an image editor window manually.  Connect the Matte output from the Cryptomatte node to the image input of the Composite node and that is what will appear in your render window.

    Intuitive and designed with artists in mind?

    There are artists employed by the Blender Institute, who have produced several animated 'open movies' (shorts, not features), including 'Spring' which was used as a testing ground for 2.80 while it was in development.  (See also 'Big Buck Bunny', 'Cosmos Laundromat', 'Elephants Dream' for previous releases).  I have to presume they were, are, and will be consulted as part of the ongoing development process. 

    https://www.youtube.com/watch?v=WhWc3b3KhnY

    https://www.blender.org/press/spring-open-movie/

    viewers01.jpg
    1920 x 1024 - 323K
    viewers02.jpg
    1920 x 1024 - 381K
    viewers03.jpg
    1920 x 1024 - 313K
    viewers04.png
    1920 x 1024 - 521K
    viewers05.png
    1920 x 1080 - 454K
  • SnowSultan Posts: 2,285

    Thanks, I will experiment with those suggestions later. I also don't literally mean there are no artists involved with Blender, some amazing things have certainly been created with it. Many would agree that it's pretty unintuitive though and has been for decades.

    Just curious, there isn't a way to make it recognize the Z axis as forward and back instead of up and down, is there? Yet another thing that makes coming to this program from another a pain in the neck.   ;)

  • Mythic3D Posts: 1,481

    I still feel like you are overcomplicating this process. What is the need to export all the different masks from Blender?

    When I do this, I save only two images from Blender: the render, and the composited render showing a separate flat color for each material that hooking up the pick output to the Composite node produces. Then in Photoshop when I want to make adjustments to just one material, I use the magic wand tool to select that material on the cryptomatte export and just click the mask button to add a mask using that selection to whatever layer or group of layers I am working on.

    I don't create and save out a billion masks in Blender - that's a ton of unnecessary work. I just export the render and the cryptomatte composited render and do all my masking right in Photoshop.
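The "colorpick the flat color, turn it into a selection" step in this workflow can be sketched in plain Python (hypothetical pixel data; the real work is Photoshop's magic wand doing an exact-color select on the cryptomatte pick export):

```python
# Toy version of the magic-wand step: build a binary mask from a flat
# material-ID image by exact colour match with the picked colour.
# Note this only works cleanly if every material got a unique flat
# colour; with repeated or blended edge colours, exact matching breaks.
def mask_from_id_image(id_pixels, picked_colour):
    return [1 if pixel == picked_colour else 0 for pixel in id_pixels]

ids = [(255, 0, 0), (0, 0, 255), (255, 0, 0), (0, 255, 0)]
print(mask_from_id_image(ids, (255, 0, 0)))  # [1, 0, 1, 0]
```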

  • SnowSultan Posts: 2,285
    edited August 20

    Because the cryptomatte composite is not assigning a different color to each material. It's only using four or five shades of blue and green, and I've had the white areas connect to other white ones, making selections difficult or impossible. If it could make something like the Random viewing option does, where every object is given a unique color to help defining them in complex scenes, I'd definitely do what you're saying.

    I'm still waiting for someone to explain why at least two different programs (Blender and Studio) offer Material ID renders but not anti-aliased ones. That's all I want, an anti-aliased render with every material assigned a slightly different color.

    Post edited by SnowSultan on