Returning to Mixamo, other threads here suggest the way to go is to use Genesis (the original, not Genesis 3 or 8) as that works well with Mixamo. Then save the animation as an animated pose and re-apply to G3 or G8. This thread has a better explanation:
Oh man, if the G1 -> G8 trick really works for retargeting, it just dawned on me that all of that free Truebones stuff should work on G8s as well. That's an impressive Library of high quality mocap and I don't see how this guy Joe makes money if he's giving everything away for free :) If I'm not mistaken, he offers G1/G2 versions of everything. Even if it doesn't, Webanimate could retarget it to G1.
Not everything is free at truebones
G1 as a donor "works" only in the sense that the motion comes through quite "rough" - even if the donor and target have the same proportions and the same footwear.
I guess people have different definitions of "work"... You get the expected foot-sliding and wobbly-motion problems. You need proper retargeting to have it work well.
Besides the motion issues, another "quality" issue is that the G1 skeleton does not use the twist bones - so even if the recipient/target G3 or G8 has the motion roughly working from a G1 source, the G3/G8 weight mapping means the joints and bends will be slightly deformed with the same motion.
That doesn't surprise me... I just can't see it "just working". Renaming bones so they match is one thing, but doing all the Optimization, Linear Algebra and Quaternions to make it really look correct, even when there are extra bones, or missing bones, or bones of different lengths, is another. I noticed that even Reallusion didn't attempt it and licensed the tech from iKinema (or was it autodesk?). But Daz gives it away for free, and doesn't tell anyone? But I still want to try it... for every year that I'm stuck with Motionbuilder, that's, say, an RTX2080ti that I can't buy.
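To illustrate the point: the renaming half really is trivial, which is exactly why it's suspicious when that's all a tool does. A toy sketch in plain Python (the bone names are made up for illustration, not the actual Genesis rig names):

```python
# Toy sketch: remap animation channels from one skeleton's bone names to
# another's. Real retargeting also has to handle extra/missing bones,
# different rest poses, and bone-length differences -- this does none of that.

# Hypothetical name map (illustrative, not the real Genesis rig names)
BONE_MAP = {
    "hip": "pelvis",
    "lShldr": "l_shoulder",
    "rShldr": "r_shoulder",
}

def remap_channels(animation):
    """animation: {bone_name: [keyframe values]} -> renamed copy.
    Bones with no mapping are silently dropped, which is exactly where a
    naive rename starts losing motion detail (e.g. twist bones)."""
    return {BONE_MAP[b]: keys for b, keys in animation.items() if b in BONE_MAP}

anim = {"hip": [0.0, 0.1], "lShldr": [0.5], "twist1": [0.2]}
print(remap_channels(anim))  # note that twist1 is simply gone
```

Everything hard (the optimization, the quaternion math, coping with mismatched proportions) happens after this step.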
My gosh. I have a comprehensive set of tutorial for Blender 2.79 I've worked part of the way through. I hope to goodness that the Blender 2.79 keyboard shortcuts are still good in Blender 2.80.
That is all.
The shortcuts are mostly the same, but there are some differences (e.g. in Edit mode, to choose vertex/edge/face select it's now '1/2/3' on the top number row by default vs. the old Ctrl-Tab-and-choose - once I got used to it, I prefer it). Others are a little different too, but it can all be customised.
As for learning Blender 2.8, I too bought a huge Udemy course.
I'm only four videos into that course and may take advantage of their 30-day money back guarantee. I don't mind that the instructor is French, but his fractured English grammar already sounds like fingernails on a chalkboard. In particular, every time he highlights a function, he says, "You have the possibility to," instead of simply saying, "You can...", and repeatedly hearing the word "possibility" spoken with a French accent has become especially jarring. It's as if he just learned how to speak with the accent and really likes saying that word. I also frequently find myself having to take a moment to process various terms he mentions before I can comprehend what he said. Consequently, I'm not getting nearly as much use out of the course as I'd like.
I'm watching this course now - and although I find it comprehensive and useful...you're so right about "You have the possibility to..." - sometimes 3 sentences in a row. I'm sticking with it as the guy really knows his stuff and I don't find it too distracting (but I think of you every time he says it ).
Additionally, and I don't know if this was possible in 2.79, but in 2.8 I can now translate objects in both global x,y,z directions OR local (up-down,left-right,front-back) according to which way the object's facing. That's gonna help me ;)
Edit: And this is why I'm going through the basic tuts again - I'm fine with general modelling techniques, just need to learn what else I can do or how much better/easier/faster I can do it.
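For the curious, "local" translation is just the global move rotated by the object's orientation. A minimal 2D sketch in plain Python (nothing Blender-specific):

```python
import math

def local_to_global(offset_xy, angle_rad):
    """Rotate a local-space offset into global space for an object
    rotated by angle_rad about the Z axis (2D for brevity)."""
    x, y = offset_xy
    c, s = math.cos(angle_rad), math.sin(angle_rad)
    return (c * x - s * y, s * x + c * y)

# An object facing 90 degrees: moving 1 unit 'forward' along its local X
# axis actually moves it along global +Y.
gx, gy = local_to_global((1.0, 0.0), math.pi / 2)
print(round(gx, 6), round(gy, 6))  # 0.0 1.0
```

In 3D it's the same idea with the object's full rotation matrix or quaternion.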
Create your scene in Daz, then use mcjTeleBlender for Blender 2.79 (it works better right now).
I have Blender 2.79 and 2.8.
If you are using Windows it's all automatic; with Mac you have to load the script and run it. Either way it all works.
Sometimes you have to adjust the eye moisture and corneas, but it's mostly all there.
Save the scene as a blend file, then open it in Blender 2.8.
Have fun with renders, lighting, etc.
Quick question because I asked mCasual the same thing: is it necessary to convert all the materials to the DAZ 3Delight default shader before using mcjTeleBlender or do you just send the scene with IRay shaders? Casual advised to convert but I've seen others mention that IRay is better.
By the way, he is working on version 4 now specifically for Blender 2.8 - there's a link to the beta in his Freebie thread.
I use the materials "as is" then apply any adjustments in Blender using the nodes etc.
Forgive me if this has been mentioned before (I can’t imagine it hasn’t), but I came across this somewhere else and figured if I don’t mention it now, I'll forget... It’s ProRender for AMD Radeon, a free physically based renderer... it has a plugin for Blender... seems like AMD’s Iray.
Oh dear, not another render engine lol...... Will have to give this a go soon, see how it measures up to Cycles. So far, the tests I have done in Cycles (with Daz characters) have left me underwhelmed compared to Iray as far as speed is concerned.
Does anyone happen to have any links to setting up opacity/transparency maps in Blender 2.8? I found a couple of videos, but they seem overly complicated and I wonder if there's an easier way just to apply transmaps for things like eyelashes, leaves, and such.
If you're using the Principled BSDF there is an alpha channel, where you can use the inverted map from cutout opacity.
And if you're not using the Principled BSDF for whatever reason, it's almost as easy: just use a mix shader at the end of your node tree to mix a Transparent BSDF with the output from the rest of the tree using the trans map as the factor. Note in EEVEE, you need to set the Blend Mode to 'Alpha Hashed' in the Settings for the material.
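The Mix Shader's factor behaves like a straight linear blend: 0 picks the first input, 1 the second. A standalone sketch of that math (plain Python, with RGB tuples standing in for shaders - this is not Blender code):

```python
def mix(shader_a, shader_b, fac):
    """Mix-Shader-style blend: fac=0 -> shader_a, fac=1 -> shader_b.
    Shaders are stood in for by plain RGB tuples here."""
    return tuple((1 - fac) * a + fac * b for a, b in zip(shader_a, shader_b))

transparent = (0.0, 0.0, 0.0)   # stand-in for the Transparent BSDF
surface     = (0.8, 0.2, 0.2)   # stand-in for the rest of the node tree

# With the trans map feeding fac: white (1.0) keeps the surface,
# black (0.0) gives full transparency.
print(mix(transparent, surface, 1.0))
print(mix(transparent, surface, 0.0))
```

So with the Transparent BSDF in the first socket, white areas of the map stay opaque and black areas go transparent.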
Felis, I connected the opacity map and the textures actually render the black areas of the map. Do you have to use a different type of node for the opacity map? Sorry if that's confusing - I'm trying to multitask and talk to someone at the same time. ;)
Add > Color > Invert - plug the map into that, then the output into the mixer. Or you could just use Photoshop or whatever to invert the colors on the map.
Could you possibly show a screenshot of how it should look? I'm not doing something right and I can't narrow down what the issue is.
EDIT: I did get it to work this way - does this look normal to those of you more experienced with Blender? Also, can you not use the full BSDF shader when you need to use transparency that comes from a separate map (not the transparency of a PNG)? Thanks for your help.
I'm also seeking some advice from those who know about materials, etc.
How well does Substance Painter work with Blender? I stumbled across something called Live Link which seems like a Blender <-> Substance Painter bridge.
I have to say that the Blender node system still scares the bejeezus out of me, but Substance Painter looks like something I might be able to get my head around.
Another question I have is, does Blender support UDIMs? Or does Substance Painter, for that matter? Is it possible to paint directly onto the model without having the limitation of seams (that's probably a nonsense question - showing how little I know about materials and textures).
Substance Painter doesn't yet support UDIM as far as I know, and Blender won't until version 2.81. There is a paid plugin for Blender where you can save PBR materials (the kind SP makes) easily. I can't link it because of Daz's TOS on paid stuff (even stuff they don't sell). You'll have to Google it.
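Side note on what UDIM actually is: just a tile-numbering convention over UV space. Tile 1001 is the usual 0-1 square, and the number increases by 1 along U (ten tiles per row) and by 10 per row in V. The standard formula, as a quick sketch:

```python
def udim_tile(u, v):
    """Return the UDIM tile number containing UV point (u, v).
    Tiles run 1001..1010 along U, then +10 per row in V."""
    return 1001 + int(u) + 10 * int(v)

print(udim_tile(0.5, 0.5))  # 1001 (the classic 0-1 UV square)
print(udim_tile(1.5, 0.5))  # 1002 (one tile to the right)
print(udim_tile(0.5, 1.5))  # 1011 (one row up)
```

Apps that support it just look for the tile number in the texture filename.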
Have a look at the screenshot I posted earlier. Substitute your trans map for the checker texture, and feed your diffuse image map into the color input of the Diffuse BSDF. Mix shaders are designed to work best when mixing shaders rather than image maps, although clearly it is doing something with your diffuse map. (You will need other nodes, e.g. a Glossy BSDF, in the node setup to make it look better. Mix in the trans map as the last step before the output.)
Or, per Felis, feed your diffuse map into the base color input of a Principled BSDF, and your trans map into the alpha input. You may not need any other nodes, just fiddle with the sliders on the Principled BSDF until you get the look you want.
Sorry can't screenshot right now, Blender machine not available in airport departure lounge!
Done my tutorials - here's a first render in 2.8. I had to make all the shaders myself, so the car-paint shaders have a bit of noise in them; turning up the samples made it better. The denoiser is a bit too harsh to use. I guess I will get better. I used the DAZ HDR map for the environment lighting.
Ahh, that's disappointing. I am on holiday so have not had a chance to start working through the course yet. I know what you mean about jarring repetition - I feel similarly about the phrase "go ahead and ..." which seems to be universal in video tutorials, being repeated endlessly. Another is "verticee" as the imagined singular of vertices. Please people, it is VERTEX!
Nevertheless I'll give it a try although, now that I am aware of it, the accented and repetitive "possibility" will no doubt test my tolerance considerably (though I generally love the French accent).
I was taught vertice (ver-teh-see) is the plural of vertex, not vertices, but I guess almost-universal incorrect usage has changed it to vertices, or my teacher was wrong.
I'd be interested if you could find anyone to agree with your teacher. A google search only turned up Vertex (singular) and Vertices (plural).
Vertex is an angle in geometry... whereas vertices are the points that make up said angle. English being what it is, its often stated to be in singular formation as "vertice" (ie: I selected a single vertice) but in technical geometric terms, its actually "I selected a single vertex" If u ask me, I think vertice should be acceptable... I mean, in English it makes sense lol But math is meeeeeeeeeeeeh like that.
But all that really isnt why I came here lol
Ive been dabbling a wee bit with 2.8... and wow, its pretty cool. Ive had blender for awhile, just as a backup to Modo and Silo... but I never used it coz dear god, that ui. I could never really figure out how to do anything in it. But now... its leaps and bounds better. Its become a pretty solid modeler. The eevee renderer is pretty awesome too.
If you still need a screenshot, here is a very basic setup, with the diffuse map and the transparency map (here inverted).
I am not sure about the inversion of the trans map. I have not found this necessary in Blender 2.80, where black is rendered transparent and white is opaque as far as I can tell. That is the same as in DS, at least for Iray and cutout opacity.
You're right.
I remembered the alpha channel in the principled BSDF as being inverse of DS, but either they have changed it, or my memory is just wrong.
Yes, I seem to remember at some point in the past that I would have to invert trans maps. I may actually have given the same advice previously. I don't know when it changed. Perhaps it was the now defunct Blender internal that treated black and white the other way about.
Thank you for the additional information. I'm still not quite getting it based on those screenshots because I don't see the transmap in Andya's examples (and plugging it right into the Alpha of the BSDF node isn't working here), but I will experiment more based on this advice. Much appreciated.
EDIT: OK I figured a few things out; you have to set a blend mode in the side panel for transparency to work with Eevee and plug the opacity map's color channel into the BSDF's Alpha even though the colored dots are different. I'm not sure why the shadows are ten times darker with Cycles than Eevee, but I'm sure there's a checkbox buried somewhere in these panels that will adjust them.
Sorry, I was 'faking' the trans map by using a checker texture in my earlier post, then just switching a mix between black and white. There's no image as such, but the principle is the same.
The color of the input/output dots is a good guide, but not absolute. As you found, you can use color output (yellow) as input to a grey dot sometimes. I'm not sure what the rules are exactly, I just experiment. I guess in this case, grayscale is a subset of color, probably just calculates a shade of grey from the RGB values, so it's OK.
EEVEE shadows do need some tweaking, in the Render panel. They're calculated differently from Cycles, of course, and less 'accurate'. Sometimes I find you have to move the view, or select a light to get the viewport to update and reflect changes.
Thanks, yeah as long as I know there are differences between how Eevee and Cycles render, it's fine. I'd render the final pic in Cycles anyway.
I do have one more question if it's OK, because this will probably determine whether I should bother with learning Blender at all. Can you render alpha channels for individual surfaces? I absolutely rely on doing this in Studio for postwork in Photoshop, and if Blender can only render alpha channels for whole objects, it won't be enough for what I need.
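If it helps: Cycles has Material Index and Object Index render passes (driven by the Pass Index you set on each material or object), and 2.8 added Cryptomatte support, either of which gives per-surface masks. As a plain-Python sketch of how an index pass becomes a per-surface alpha mask (illustrative only, not Blender code):

```python
def surface_mask(index_pass, target_index):
    """Turn a material-index render pass (2D list of ints, one ID per
    pixel) into a black/white alpha mask for one surface."""
    return [[1.0 if px == target_index else 0.0 for px in row]
            for row in index_pass]

# A tiny 2x3 'render' where material index 2 is the surface to isolate
idx = [[0, 2, 2],
       [0, 0, 2]]
print(surface_mask(idx, 2))  # [[0.0, 1.0, 1.0], [0.0, 0.0, 1.0]]
```

In practice the compositor's ID Mask node (or a Cryptomatte node) does this step for you; the sketch just shows what the mask is.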
On one hand, it doesn’t really matter whether it makes sense in English, because neither vertex nor vertices are “English” words. Then again, English is funny about when it does and doesn’t respect the rules of whichever language it’s importing. People use words like “alumni” and “agenda” as singular words all the time, even though they’re inherently plural, and there’s no such thing as a homo sapien, because “sapiens” IS the singular form of the word.
I dated a Japanese woman, and one time she told me that she had “a bad news”. I told her that that doesn’t really work, and that she should just say she has bad news. She turned that over in her head a few times, then tried “I have a bad new?”. It makes sense according to the basic rules of English, but it doesn’t quite work, does it?
Reallusion uses the Maya Human IK system from Autodesk.
The 3DXchange retargeting system in iClone Pro is every bit as powerful as Motionbuilder for retargeting to non-matching skeletons, and you can save the retargeting template for later use.
As was stated, the G1 to G3/G8 trick is a decent "last resort" workaround to get rough, base-layer, legacy body motion onto a G3/G8 figure in Daz Studio if you have no access to proper retargeting tools like iClone, IKinema or MoBu.
However, be prepared to use the graph editor in Daz Studio to globally fix hands and other limb parts that "go astray".
The new IK system in the 4.12 beta could help a lot with foot slide after some manual setup, I am given to understand.
I personally have no use for it, as I use iClone Pro and bake my IK to FK upon export to BVH.
Probably asked and answered in here, but does anyone know a video where I can see how to work in 2.8, after being forced to stay on 2.79 until two weeks ago because my old GPU was poo and not supported?
For the adventurous, but hey....UDIMS! Warning: audio is super crappy ;).
https://youtu.be/fuldUVJ-FBU
Laurie
ProRender:
https://www.amd.com/en/technologies/radeon-prorender
ProRender for Blender:
https://www.amd.com/en/technologies/radeon-prorender-blender
And it is for Mac, too, but AI denoising only on CPU.