Anyone watch Daz's animated videos?
Sorel
Posts: 1,389
All the videos advertise "model, render, animate" with software from DAZ, but which program can inflate a figure and make it pop? Or make a procedurally growing tree? Are these hints of things we'll be able to do eventually?
Comments
Unity can do those things. Blender too.
Those aren't daz programs though?
No, but you didn't say it had to be a DAZ 3D program. Anyway, Bryce & Carrara can do a bit of what you talk about, but whether they or DAZ Studio will be modernized to do what Blender, Unity, UE4, and so on can do is a matter of much conjecture in the forums, with no confirmation at all from DAZ 3D that those are goals.
The inflating elephant looks like someone just animated the Y Scale parameter. It's not a real inflation simulation.
I just gotta say....the clown video was creepy. LMAO
Laurie
The liquid physics I believe was done with this plugin:
https://www.daz3d.com/fluidos-for-daz-studio
and the rest was done in DAZ Studio from the looks of it. Inflating the elephant was running a dForce simulation in reverse.
Fluidos is the only new thing in those animations, and it costs extra and is not a DAZ Original product.
Blender??...likely
Unity..? NO ....not really
There are some incredibly high-quality videos being RENDERED in Unity and Unreal's realtime engines,
most notably the three-part "Adam" series from Unity.
Understand this, however:
these are realtime playback engines for rendering pre-animated imported assets, driven either via game controllers for gaming
or via the incredible "Cinemachine" procedural camera system in Unity.
The "Adam" series was created with a combination of pre-animated rigs imported as FBX from Maya for the mech,
and Alembic caches for the humans made from 3D scans of real-life mocap actors.
PLEASE go and watch the "Making of" YT videos for the "Adam" series and see for yourself.
There are no "character animation" creation tools in Unity per se,
only means to control characters animated elsewhere and imported.
Thus anyone needing a character inflating/exploding effect will need to create it in a true 3DCC program and import it into Unity or Unreal as Alembic for realtime rendering, after setting up your lighting, materials, cameras, etc.
No different from those of us who use (for example) DAZ Studio or iClone to animate a DAZ Genesis character
and then export it as FBX or MDD/Alembic caches to C4D, LightWave 3D, Blender, etc. to use their render engines.
Now, the new Universal Scene Description (USD) format from Pixar
will soon make these asset transfers more compatible in the area of texture conversion, but you still need to create your character animations in a program that has such motion-building tools.
Unity has none.
DAZ software has come a long way. DAZ Studio can do simulated water, dForce for dynamic clothing and hair, and animated dust and light effects, so all things are possible with newer versions of DAZ Studio. I always have to remind myself that DAZ relies more on third-party plugins to make the software usable for animation than on the DAZ developers building all the little specialty features; even aniMate2 is a third-party plugin, and it is really needed for animation with DAZ. It's been that way since I started with DAZ 8 years ago. That's great for vendors and 3D artists selling content here, but bad for people trying to keep the software working for animation every time DAZ releases a new version, because DAZ seems to like breaking things in new releases that worked in the older versions.
I find it fun to make animation with DAZ. I do all my animations completely in DAZ Studio, even the effects you listed; I do not use Blender or other programs to render my animated scenes. However, I will use Blender to make or modify DAZ content, such as reducing polys or remaking a room model set so the doors and windows can open, and I will use a film editor to add postwork effects.
Otherwise, doing animation with DAZ is fun. It's not easy; it takes time to learn the DAZ software, like anything else. If you want drag-and-drop animation software, then you should go with Mixamo or Simms; I think iClone has become pretty much drag-and-drop 3D as well. If you require more robust 3D animation software, then Blender or Maya will probably be better options.
But that is just my opinion; other opinions will vary.
This is not really true. Unity has plenty of awesome animation tools. In fact, I have decided to use Unity tools over Maya. The main reason is the Cinemachine camera-animation asset you mentioned, but for character animation there is this: https://assetstore.unity.com/packages/tools/animation/umotion-pro-animation-editor-95991 and this: https://assetstore.unity.com/packages/tools/animation/final-ik-14290 and especially this: https://assetstore.unity.com/packages/tools/physics/puppetmaster-48977 , which is something even Maya doesn't have, though I believe you are doing something similar with Endorphin. It is because of these tools and more that I am creating my animation mostly in Unity and transferring it over to Maya for simulations and rendering. Combine that with the fact that every mocap hardware package has links into Unity, and it makes a great way to animate. It's much faster to iterate and produce animation procedurally. Seeing that the animation stage is the bottleneck in the whole pipeline, I can see many other one-man studios using gaming platforms in their workflow. I think Maya has a little catching up to do with the realtime animation methods in Unity/UE4.
Unity and UE are really getting alot of VFX/3DCC companies making their tools available for use inside Unity/UE without even having to leave it. Houdini Engine from SideFX is one example, where you can do simulations and procedural geometry creation using Houdini inside the Unity and UE editors without having to go through the cumbersome export/app switch/import/export/app switch/import workflow. Octane in Unity is another example.
OK, so it's a lot easier to use Blender & import, but there are animation-editor addons in the Unity Asset Store, and even rigging tools.
1) Skele: Character Animation Tools by TMPxyz - $50
2) Puppet3D by Puppetman - $50 (rigging / weight painting)
3) UMotion Pro by Soxware Interactive - $60 - is really quite capable
4) Final IK by RootMotion - $90 - a procedural runtime IK solution for games (it can also save those procedural animations out to disk, by the way)
5) PuppetMaster by RootMotion - $90 - advanced active ragdoll physics, which can likewise be saved out procedurally in animation formats
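For anyone wondering what "procedural runtime IK" in tools like Final IK actually means: instead of hand-keying joint rotations, a solver computes them from a desired end-effector position every frame. A minimal two-bone (arm/leg style) solver can be sketched in plain Python with the law of cosines. This is just an illustrative sketch of the general technique, not Final IK's actual implementation:

```python
import math

def two_bone_ik(l1, l2, tx, ty):
    """Analytic two-bone IK in 2D: given upper/lower limb lengths
    l1, l2 and a target (tx, ty) relative to the root joint, return
    (root_angle, bend_angle) in radians."""
    d = math.hypot(tx, ty)
    # Clamp the target distance to the reachable range
    d = max(min(d, l1 + l2), abs(l1 - l2))
    d = max(d, 1e-6)  # avoid division by zero at the root
    # Law of cosines gives the interior angle at the middle joint;
    # the bend is its supplement (0 when the limb is fully extended)
    cos_elbow = (l1 ** 2 + l2 ** 2 - d ** 2) / (2 * l1 * l2)
    bend = math.pi - math.acos(max(-1.0, min(1.0, cos_elbow)))
    # Root angle: aim at the target, then offset by the triangle corner
    cos_corner = (l1 ** 2 + d ** 2 - l2 ** 2) / (2 * l1 * d)
    root = math.atan2(ty, tx) - math.acos(max(-1.0, min(1.0, cos_corner)))
    return root, bend

# With equal bone lengths, reaching (1, 1) bends the joint 90 degrees
root, bend = two_bone_ik(1.0, 1.0, 1.0, 1.0)
print(round(math.degrees(bend)))
```

A runtime solver like this just re-runs every frame as the target moves, which is why the result can then be baked out to keyframes on disk.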
Also consider that for typical humanoid biped animations there is really a multiplicity of ready-made animations for every type of movement your character could do, if you are talking running, walking, and jumping, and many are available free in the Unity Asset Store. You'd only really need to create animations for things like opening doors and other activities peculiar to your game and animation.
Unity is also expanding the animation capabilities of its editor, and those features will be free once completed.
Of course Blender is free and can already do all the animation bits.
The RootMotion fellow has really quite capable mathematics and physics training, which he has deftly combined with his excellent programming skills.
Thanks Nonesuch & DRzap,
I stand corrected; I should have stated to the OP that the default free Unity has no advanced character-animation creation tools, nor the ability to explode a character, without purchasing over $300 USD in addons.
IMHO, however, the game-changing free Cinemachine procedural camera system and the free Timeline for nonlinear editing of Alembic caches would more than compensate for having to buy those addons for character-based animated filmmaking in Unity.
Reallusion licensed the Maya HumanIK solver from the mighty Autodesk, so its IK system is the same as Maya and MotionBuilder.
When you import a Maya HumanIK rig into iClone 3DXchange,
it is auto-recognized and instantly characterized to accept iMotion data from iClone with one click.
Drag and drop saves a lot of time blocking out base locomotions.
iClone does have a procedural motion-clip-based system, as well as a spline graph editor, a dope sheet, auto lip-sync from audio, a phoneme editor for lip-sync refinement, and natural auto eye-blinking.
Unlike DAZ aniMate2, iClone's motion-clip recording system allows easy layering of hand-keyed FK on top of your solved IK at any point
(not that DAZ has any solved IK).
And its walk paths can be projected onto uneven terrain, enabling your characters to easily walk up and down inclines and declines.
And it has many tools designed specifically to correct the foot sliding that you see in all DAZ animations, particularly when the weight is shifted from one side of the body to the other.
I am not aware of any Blender options to create lip-sync automatically from audio, other than some third-party facial-capture plugins
that use your webcam on a specially prepared face rig.
However, when a stable 2.8 is in full release, I predict you will see a flood of new Blender add-ons for every aspect of character animation/filmmaking.
Unity's free Kinematica, as part of the Unity Editor, will slice & dice existing animations and recombine them into fluid new animations. So say you manually create an animation for multiple characters that go and visit a restaurant, sit down, order, and eat while being served by wait staff; Kinematica will let you feed it that animation data, which can then drive similar behavior in a game. Now say that you want to create an animation in a new environment, say a home, that is similar to the restaurant action. You don't need to reanimate the entire scene; you just play it out as gameplay and procedurally save that animation data.
https://blogs.unity3d.com/2018/06/20/announcing-kinematica-animation-meets-machine-learning/
So that will be a huge time saver for animators in the future.
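The slice-and-recombine idea behind Kinematica is essentially motion matching: every frame of your existing clips is stored as a feature vector (speeds, trajectories, joint velocities), and at runtime the system picks the stored frame that best matches what the controller is asking for. A toy sketch of that lookup, with hypothetical feature data, might look like:

```python
def best_match(database, query):
    """Toy motion matching: return the index of the database frame
    whose feature vector is closest (squared distance) to the query.
    Playback would then continue forward from that frame."""
    def dist(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b))
    return min(range(len(database)), key=lambda i: dist(database[i], query))

# Feature vectors captured from hand-made clips (hypothetical data):
# [forward_speed, turn_rate]
clips = [
    [0.0, 0.0],   # 0: idle
    [1.4, 0.0],   # 1: walk straight
    [1.4, 0.6],   # 2: walk turning
    [3.5, 0.0],   # 3: run straight
]

# The controller asks for fast, straight-ahead motion:
print(best_match(clips, [3.0, 0.1]))  # picks frame 3, "run straight"
```

Real systems search millions of frames with spatial acceleration structures and blend between matches, but the core decision is this nearest-neighbor pick.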
Indeed this is an exciting time to be a character animator
as there are many new tools that are not only scalable for team environments
but can also be used to add efficiency to the pipelines of us one-man operations.
Yeah, I'm really excited about upcoming animation tools for Unity, especially the machine-learning procedural animation technologies. They make easy locomotion animation possible (which can be as much as 1/4 of all animation in a project): one can animate in realtime by just playing a "game" with their favorite controller, recording the gaming session as an FBX file, and retargeting it to a character in the desired DCC application, or just rendering it out in realtime. This is a game changer. Animating car chases or dogfight sequences can be done by just playing a game. No more tedious walk cycles or hand keying on path splines, and with the new AI procedural tools, no more even animating impact reactions, stumbling, falling, or any of the natural bipedal movements that make hand animation so costly. This frees the artist up for final authoring and polishing and saves a ton of time. I'm convinced that this is the future - https://www.deepmotion.com/ Coming soon to Unity/UE4.
Yes all it takes is money for all those wonderful tools
Hi DRzap, did you know every link you posted in this thread gives a 404 "can't be found" error? At least it did for me.
that's the DAZ forum software;
it does it to my links often too. Maybe try removing the end slash:
https://www.deepmotion.com
nup
https://www.deepmotion.com/%C2%A0 DAZ does this to the link
I had to use source in the forum html editor to fix it
Oh, sorry about that. Search the Unity app store for these assets:
Final IK
Puppetmaster
UMotion Pro
Those are the tools I was referring to.
Edit: Here is a link to a youtube test video of a Daz figure (imported into Maya with the Genesis 8 for Maya plugin) with retargeted raw mocap from iPiSoft. Now imagine you could cut this mocap up into pieces defining different kinds of actions. Then you could assign these pieces to buttons on a game controller. In Unity, you could use your controller to walk your character thru a scene while pushing buttons for appropriate actions at the proper place and time. The software will blend those actions smoothly and naturally. After you're finished, you can export the whole thing as a motion fbx file and retarget in any app, including Daz Studio. You wouldn't have set a single keyframe or messed with a single animation curve. This is what I call freedom.
Link: https://youtu.be/UzEb1kefcEY
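The smooth blending between button-triggered actions described above boils down to crossfading: ramping the weight from the tail of one clip into the head of the next. A bare-bones sketch with a single joint value per frame (hypothetical data, not iPiSoft's or Unity's actual blend code):

```python
def crossfade(clip_a, clip_b, blend_frames):
    """Blend the last `blend_frames` frames of clip_a into the first
    `blend_frames` frames of clip_b with a linear weight ramp.
    Clips are lists of per-frame values (one joint, for simplicity)."""
    out = clip_a[:-blend_frames]
    for i in range(blend_frames):
        w = (i + 1) / (blend_frames + 1)  # weight ramps from a toward b
        a_val = clip_a[len(clip_a) - blend_frames + i]
        out.append((1 - w) * a_val + w * clip_b[i])
    out.extend(clip_b[blend_frames:])
    return out

walk = [0.0, 0.0, 0.0, 0.0]   # hypothetical joint angle, "walk" clip
wave = [1.0, 1.0, 1.0, 1.0]   # hypothetical joint angle, "wave" clip
print(crossfade(walk, wave, 2))
```

In a real engine every joint gets this treatment (with quaternion interpolation rather than scalar lerp), and the blended result is exactly what gets baked when you export the session as a motion FBX.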
Thanks for the corrections, guys. I was clicking the links for DeepMotion, which was what I was interested in learning about, and was getting bad errors. Got it now.
Actually this ability has existed at least since 2012.
We call them "persona" files.
Imagine having an unlimited number of predefined motions & behaviours from disparate sources (mocap hardware, ragdoll from Endorphin, various idles, walks & runs, etc.).
Yet instead of having to load them into a nonlinear motion track
each time as an aniBlock, you can store them all in one "persona" file that can be embedded into any realtime character and will stay neutral until accessed via a contextual right-click menu,
and you determine, via mouse click, the location in the scene to which you want the character to ambulate.
At any given point in your motion development you can right-click
and have the character smoothly transition into an idle, a talking gesture, or a fall back from a punch in the face.
Now each time he/she performs a right-clicked "persona" motion, it is recorded to a new nonlinear timeline clip that can be further refined/edited until you are ready to either export the new data as BVH to DAZ etc., or export the character himself as FBX or Alembic to any other 3DCC/VFX package for final render.
Here is a short example of a character being driven by his embedded
"persona" files and exported as DAZ Genesis BVH data for DAZ Studio.
Here is that same avatar with a simple idle persona motion exported as Alembic to LightWave 3D for a quick preview render using LightWave's super-fast VPR (viewport renderer).
"We call them 'persona' files. Imagine having an unlimited number of predefined motions & behaviours from disparate sources (mocap hardware, ragdoll from Endorphin, various idles, walks & runs, etc.)."
Yup, Reallusion's persona files were an early example of auto-keyframing. But it's absolutely clumsy compared to what we have now in AI-learned motion. With previous technologies, we were limited to the motion files that we fed our avatars, and it wasn't in realtime. Now, avatars can learn new motions and can behave almost independently depending on environmental stimuli, all in realtime. Instead of using dumb ragdolls that just know how to fall dramatically, we are on the cusp of having characters that have novel reactions to contact, adapt to changing conditions, learn to stabilize themselves, and even catch themselves as they are falling, all without the animator's input. We don't need to be limited to just the animations we feed the avatar. What Reallusion started with persona files, AI has taken the next few steps toward realism and ease of animation. The best part is that AI technology doesn't stop with just locomotion animation; it will be used in all aspects of filmmaking. This is just the beginning. We have game engines like Unity to thank for this. Because of their need to create massive amounts of animation under tight restrictions, all types of animators can take advantage of their progress.
The behaviours you just described have existed since Peter Jackson's "Lord of the Rings" films were made, with the huge battle for the Hornburg scene created by the proprietary "Massive" software at Weta Digital sixteen years ago.
As usual, it has taken this long for graphics hardware technology to become affordable enough for users outside of those large VFX houses to take advantage of these advanced software algorithms.
And like everything else graphics-related, from frame rates to monitor/TV resolutions and delivery formats,
the video game industry has led the way and set the standards.
You and I both know that affordable AI will soon make being a
"character animator" obsolete... and this is a bloody good thing,
because the current character-animation tools in every major 3DCC package are merely a cumbersome and time-sucking hindrance to the visual storytelling process.
This is why I am still running so much old 3DCC software:
all of the "new" versions still rely on 20-year-old, nearly masochistic approaches to dealing with character motion.
I still run C4D R11.5 from 9 years ago, yet C4D R20 (released this past summer)
still cannot even import and retarget a BVH file with the same ease that Poser has been offering for over 20 years,
or solve dynamic clothing on a moving figure like DAZ dForce, Poser, or iClone cloth physics.
Yet I recently had a Maxon C4D loyalist at another web forum
tell me I am not a "real animator" because I advocate the use of facial mocap, or at a minimum audio-based auto lip-sync, over hand-curating my talking characters' lip-sync with manual keyframing... like they do at Pixar.
Sure, we all love Pixar, but I have to wonder why we had to wait over a decade for a sequel to "The Incredibles".
Being a "real animator" is bloody exhausting.
I want to be a virtual Movie Director.
"I want to be a virtual Movie Director."
Amen!
"Virtual movie director" is exactly the direction Unity 3D and, although I don't follow them, UE4 and other gaming engines are going in.