MYCAP Studio Facial Motion Capture..?

Melissa Conway Posts: 590
in Art Studio

Has anyone tried MYCAP Studio 2012 Professional with DAZ? The product info page (http://www.rebanrobotics.com/mycap_studio_more.html) doesn't say it works with DAZ specifically, but it does export .bvh files and it's actually affordable...

Comments

  • Bryan Steagall Posts: 233

    Hi Melissa.

    I tried their sample files, and it looks like their strong suit is facial and hand mocap, because of how the data is captured (with 2 webcams). Full body is not going to be very accurate, though, and it looks like it would take some processing to get it into DAZ (the joint names in the .bvh file don't correspond to DAZ joint names).

    It also looks like it only captures the rotation of the joints and doesn't include movement across space (translation) the way other programs do (IPI, for example). That makes it good for applications like biomechanics, where you are really only looking at rotation data, but not really for animation. If you are only doing simple motions, like standing still and waving, it would be OK, but you would have to rename the joints to get usable data, along the lines of the sketch below.
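
    Here's a rough Python sketch of what that renaming step could look like. The joint names in the mapping and the file names are made-up examples (not MYCAP's or DAZ's actual names), so treat it as a starting point, not a ready-made converter:

        import re

        # Hypothetical name mapping; check the MYCAP export and your DAZ figure
        # for the real joint names on both sides.
        NAME_MAP = {
            "LeftUpArm": "lShldr",
            "LeftLowArm": "lForeArm",
            "RightUpArm": "rShldr",
            "RightLowArm": "rForeArm",
        }

        def rename_bvh_joints(in_path, out_path, name_map):
            """Rewrite ROOT/JOINT names in the HIERARCHY section of a .bvh file."""
            pattern = re.compile(r"^(\s*(?:ROOT|JOINT)\s+)(\S+)", re.M)
            with open(in_path) as f:
                text = f.read()
            renamed = pattern.sub(
                lambda m: m.group(1) + name_map.get(m.group(2), m.group(2)), text)
            with open(out_path, "w") as f:
                f.write(renamed)

        # Placeholder file names for illustration only.
        rename_bvh_joints("mycap_export.bvh", "daz_ready.bvh", NAME_MAP)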

    Hand and facial capture seem to be pretty good, but would not translate to DAZ very well (facial animation in DAZ and Poser is morph based, and this data is bone/marker based). It really requires something like Maya, Lightwave, or Motionbuilder to bring the data in and apply it to a facial model that is rigged to accept the data it outputs.

    There are other programs, like Brekel Kinect, that are just as affordable and easier to work with: http://www.brekel.com/brekel-kinect-pro-body/download-trial-buy/
    Brekel allows for two full skeletons, actually captures translation data (limited to what the Kinect sensor can see), and works quite well.

    Cheers!

    Bryan

  • Melissa Conway Posts: 590

    Thanks so much for replying, Bryan. MYCAP doesn't sound user friendly enough for my purposes. :-(

    I'm looking for facial mocap and was considering Brekel, but I have the Xbox version of Kinect, and from the specs on the Brekel site I got the impression that version wouldn't produce optimal results.

  • Bryan Steagall Posts: 233

    Hi Melissa.. I evaluated it with an Xbox version, and it works OK. The "Microsoft" version gives you the ability to get closer to the camera, which the Xbox version doesn't, but all in all it is still pretty good.

    What I did was buy a used Xbox Kinect from GameStop for about 70 bucks, and once I finished evaluating the software, I returned it. (My need was more for body mocap than facial.) You can do that if you want to test it first.

    Unfortunately there is not a lot out there for facial mocap that is relatively inexpensive or that actually works with DAZ. Brekel is about the least expensive option that works directly with the pz2 format.

    There are others, like http://zigncreations.com/products/zign-track-pro/ but it is almost $900.00 (which, in the context of what motion capture systems cost, is not that bad).

    Maskarad, by Di-O-Matic, is by far the easiest facial mocap system to use (it is a markerless system that uses virtual markers and audio data from a video file), and they just dropped their price to $1000.00: http://www.di-o-matic.com/products/Software/Maskarad/#page=overview
    It is not directly compatible with DAZ though (but you could probably copy/paste animation curves from one to the other).

    Honestly, I almost always fall back to using Mimic for my DAZ/Poser animations, then manually keyframe expressions, eye movements, etc.

    Cheers

    Bryan

  • Melissa Conway Posts: 590

    I made the mistake of purchasing iClone and 3DExchange pro thinking I could use it to easily create facial animation that I could then export back to DAZ, but no! Not possible yet, and on the iClone forums, at least, there's no hint as to whether DAZ and iClone will work out the bugs for a future release.

    In the meantime, I was hoping to avoid the Mimic workarounds (downloading the 32-bit version of DAZ and finding the special file that allows me to use it on Genesis). Then again, there's no point investing in a facial mocap program that doesn't capture the lip sync and facial expressions very well. If I have to go in and do a lot of tweaking, might as well do it all by hand in the first place...

  • Bryan Steagall Posts: 233

    Well, unfortunately there is no perfect motion capture system or workflow; every system has its imperfections, even those that cost tens of thousands of dollars. (Trust me, I know.. I sell motion capture technologies, from optical to inertial.)

    You will always have to tweak in one way or another, adjust animation curves, clean up data before applying to your character, etc.

    It is even worse with facial animation, since there are so many different ways a character can be rigged for it (actual bones, morphs or blendshapes, etc.), while facial mocap technologies usually rely on a marker based approach. You need something in between that can take that marker based data and translate it to morphs, if that is what you are using (Motionbuilder, for example); the sketch below shows the basic idea.
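
    As a crude illustration of that in-between step, the toy Python sketch below just normalizes a single marker's travel from its neutral position into a 0..1 morph dial value. The 25 mm jaw travel, the chin marker, and the "MouthOpen" morph name are all hypothetical; a real retargeting tool (Motionbuilder, etc.) does this for many markers and morphs on every frame:

        def marker_to_morph_weight(displacement_mm, max_travel_mm):
            """Normalize a marker's travel from its neutral position to a 0..1 dial value."""
            return max(0.0, min(1.0, displacement_mm / max_travel_mm))

        # e.g. a chin marker that drops up to 25 mm when the mouth is fully open
        for drop in (0.0, 6.0, 12.5, 25.0, 30.0):
            print("MouthOpen =", marker_to_morph_weight(drop, 25.0))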

    DAZ, Carrara and Poser also suffer from the stigma of not being considered "professional" software, so they are largely bypassed by mocap hardware and software manufacturers. It has been only recently that independent small companies have targeted this market.

    I know your frustration; I wish I had the programming skills to develop my own plugins.

  • Auroratrek Posts: 201

    Hi Bryan (we corresponded over at OptiTrack some time ago). My question, and maybe you know this: even with a high-end facial mocap system, what about tongue movements when talking? That can't be captured, can it?

  • Bryan Steagall Posts: 233

    Hi TV.. I remember...

    Not that I'm aware of (at least, not in a production environment). I have seen a few doing this in research (you can google it and find a few references), and you could place markers on the tongue and create a rigid body to track, but it would be rather.. um.. tricky (imagine swallowing a marker by mistake? lol), because the markers are almost always hidden from view when talking.

    The only apps that come close to doing anything with the tongue are those that use audio information (phonemes), like Mimic, Maskarad, or Voice-O-Matic (which is like Mimic, but for Maya/3ds Max/Softimage).
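
    For the curious, here is a toy sketch of how those phoneme-driven apps work conceptually: map each timed phoneme to a mouth-shape morph and a dial value, and lay those down as keyframes. The phoneme labels, morph names, timings, and weights below are hypothetical stand-ins, not Mimic's or Maskarad's actual data:

        # Hypothetical phoneme -> (morph name, dial value) table.
        PHONEME_TO_MORPH = {
            "AA": ("MouthOpen", 1.0),
            "M":  ("MouthClosed", 1.0),
            "F":  ("MouthF", 0.8),
            "OO": ("MouthO", 0.9),
        }

        def keyframes_from_phonemes(timed_phonemes):
            """Turn (time_in_seconds, phoneme) pairs into (time, morph, weight) keyframes."""
            keys = []
            for t, ph in timed_phonemes:
                morph, weight = PHONEME_TO_MORPH.get(ph, ("MouthNeutral", 0.0))
                keys.append((t, morph, weight))
            return keys

        print(keyframes_from_phonemes([(0.0, "M"), (0.15, "AA"), (0.40, "OO")]))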
