DAZ Director Plugin (WIP)
sidcarton1587
Posts: 78
So as I mentioned in a different post, I've started tinkering with integrating DAZ Studio with AI-assist to define the art of the possible. What I want to be clear about up-front is that this is not about AI image generation, and is not related to the DAZ AI Studio product released by DAZ.
What I am talking about is using AI to help Studio users accelerate their workflows where they need the accelerating. The AI in this case becomes your "gopher" (agent in AI parlance), to get you past tedious tasks so you can focus on what's most interesting to you. That being said, what's tedious to you might be the most enjoyable part to someone else. So this tool would not be intended to replace the existing Studio interface, but support it.
Anyway, let's get down to brass tacks. What are we talking about exactly?
Here's the "marketing" literature response:
DAZ Director connects an AI (currently Claude, though other commercial or local models could be swapped in) directly to DAZ Studio. Not as a helper that writes scripts you then have to paste and run yourself. Not as a chatbot that gives you instructions to follow manually. As a director. You describe what you want. It happens.
Here's what that looks like in practice.
You open a conversation pane (just another tab pane, like any other in DAZ Studio) and start talking about a scene. You want a portrait setup — a female character, dramatic and elegant, in a minimal interior. The AI talks through the possibilities with you: what kind of lighting would suit the mood, what camera angles would work, what the scene needs. Then, when you're ready, it starts building.
It searches your DAZ asset library — your actual installed content — finds a suitable character and environment, loads them into the scene, and positions everything. Then you say: "Create a cinematic close-up camera for her face, slightly low angle." The camera appears, framed exactly as described. You say: "Add a three-point lighting setup — soft portrait lights, warm key, cool fill." The lights are created, positioned, and configured. You say: "Now give me a camera animation that starts far back in the darkness, rushes in from behind her, and sweeps around to show her face." The keyframes are set. Press play.
None of that required opening a menu.
But here's what's equally important: at any point in that process, you can put down the conversation and pick up the mouse. Drag a light to a slightly different position. Tweak a morph slider by hand. Adjust the camera framing by eye. DAZ Director doesn't take over your workspace — it works alongside it. The dials and sliders you already know are still there, still yours, still often the fastest way to get exactly what you want.
So right now the basics can be demonstrated. The plugin can take a command and do some pretty interesting things -- the things the marketing literature mentions above, like finding content, adding it to the scene, moving cameras around, and creating lights, all from natural language commands -- but it's not foolproof, and it has many limitations that still need to be worked out.
What I am hoping to get out of this discussion are answers about how people use DAZ and how something like this could help them use it better, more enjoyably.
Some questions that immediately come to mind include:
- Is something like this really useful for DAZ Studio users? I'm doing this as a proof of concept, but the concept keeps expanding.
- What kinds of natural language commands would you love to see? Full scene descriptions? Help posing characters? Expressions? Animations? What parts of your DAZ workflow do you find the most tedious?
- Keeping in mind that this doesn't come for free: even if the plugin were low-cost or free, there is still the matter of paying for the AI itself. That means either purchasing API credits from a provider like Anthropic (Claude), OpenAI (ChatGPT), or Google (Gemini), or downloading and running your own models -- with the understanding that the latter is more technically involved and works best when you have local resources to handle it (a GPU with 12 GB+ of VRAM, for example). Is that tradeoff or cost something you would consider?
Anyway, I'm in the process of putting together a demo video of the basic capabilities, and a more detailed post about the technical details, in other forums. But I thought I'd put this out here to foster discussion.

Comments
Interesting. I have had a few projects on the back burner, including a Pose Executor that uses phrases to pose characters in a scene, even for animation.
So did this incorporate an LLM, straight-up NLP, or something else? Honestly, I've left posing as a special case that requires very specific attention. One of the hardest things to get right.
Very interested in this PromptPose tool — does it work with non-Genesis characters, or can it at least work with any character that has the properly named bones your tool expects?
With regards to how AI can help or expand current DAZ workflows... it would definitely help if we could prompt for a pose and/or animation. Tighter AI mocap or OpenPose integration would be amazing. However, I foresee quite a bit of friction, as it may make many assets in the DAZ store unsellable, like pose/animation assets. Maybe this could be part of the Premiere package, with royalties paid to the pose/animation creators if their work was used to train an AI model for DAZ.
If you've seen Cascadeur's inbetweening tool: https://cascadeur.com/help/category/278. Coupled with its AI-assisted pose/animation editing tools and auto-physics, it is amazing what they have built there. I feel DAZ could use a tighter integration with this tool or a similar AI assist, since DAZ advertises being able to do animations but its own tools have not seen significant improvement in years, other than small changes to the timeline.
A few key areas where it could really help: setting up a scene, if it understands placement and so on; composition, if it can see a camera view and respond to typical composition commands; and lighting, which can be time consuming, so it would be great if it could block out some basic lighting. And finally, help with the histogram — how to fix lighting problems, especially when the HDRI is blown out but your scene is dark.
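For the histogram case, one simple automated check is to measure what fraction of pixels are clipped at either end of the range. A minimal sketch of the idea (my own illustration, not part of any existing plugin), assuming a rendered image as a float array normalized to [0, 1]:

```python
import numpy as np

def exposure_report(image: np.ndarray, clip: float = 0.02) -> dict:
    """Flag blown highlights and crushed shadows in a float image in [0, 1].

    A scene is flagged when more than `clip` (2% by default) of pixels sit
    at either extreme of the histogram.
    """
    lum = image.mean(axis=-1)                      # crude per-pixel luminance
    blown = float((lum >= 1.0 - 1e-3).mean())      # fraction at the top end
    crushed = float((lum <= 1e-3).mean())          # fraction at the bottom end
    return {
        "blown_fraction": blown,
        "crushed_fraction": crushed,
        "blown": blown > clip,
        "crushed": crushed > clip,
    }
```

A report like this could drive a suggestion such as "lower the HDRI intensity and raise the fill light" when both flags fire at once.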
So can you think of an example composition instruction that would be useful? Seeing through the camera view, placement, and blocking out lights I've seen it do, but as with anything AI the instructions have to be precise. Composition would seem to be something a bit more subjective, but maybe not?
Actually, I do have another project that creates a light rig for a selected character. I made it for a car render animation I was working on. It is not complete, and the HDRI section is not showing in this version.
That part was a bit more involved: find the brightest spot in any given HDRI and add a ghostlight there as your source.
Changing the environment light means the system has to analyze the new HDRI to relocate the source light correctly.
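The brightest-spot analysis can be done directly on the equirectangular image. A rough sketch of the idea (my own, not the poster's code), assuming an (H, W, 3) float HDRI and returning the direction of the peak pixel:

```python
import numpy as np

def brightest_direction(hdri: np.ndarray) -> tuple[float, float]:
    """Return (azimuth_deg, elevation_deg) of the brightest pixel in an
    equirectangular HDRI given as an (H, W, 3) float array."""
    # Luminance via Rec. 709 weights.
    lum = hdri @ np.array([0.2126, 0.7152, 0.0722])
    row, col = np.unravel_index(np.argmax(lum), lum.shape)
    h, w = lum.shape
    # Map pixel coordinates to spherical angles: columns span
    # azimuth -180..+180 degrees, rows span elevation +90..-90 degrees.
    azimuth = (col + 0.5) / w * 360.0 - 180.0
    elevation = 90.0 - (row + 0.5) / h * 180.0
    return azimuth, elevation
```

In practice you would blur or cluster the luminance map first so a single hot pixel doesn't win over a large bright region like the sun's disk.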
This one is geared to showrooms, so it was easier to move forward. It also creates a director camera to which the lights are linked.
Key light, rim light, fill light, and top wash light are all created in a single click.
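The geometry behind a one-click rig like that is just a few spherical offsets around the subject. A hedged sketch (the angles and names here are my own guesses, not the tool's actual values), assuming DAZ-style centimeter units with Y up and the camera on the +Z side:

```python
import math

def three_point_rig(subject=(0.0, 150.0, 0.0), distance=200.0):
    """Return world positions for a key/fill/rim/top-wash light rig
    placed on a sphere of `distance` cm around a subject point."""
    sx, sy, sz = subject

    def offset(azimuth_deg, elevation_deg):
        az, el = math.radians(azimuth_deg), math.radians(elevation_deg)
        return (sx + distance * math.cos(el) * math.sin(az),
                sy + distance * math.sin(el),
                sz + distance * math.cos(el) * math.cos(az))

    return {
        "key":  offset(-35, 25),   # front-left, above eye line
        "fill": offset(40, 10),    # front-right, lower and softer
        "rim":  offset(180, 30),   # behind the subject
        "top":  offset(0, 80),     # near-overhead wash
    }
```

Each position would then be applied to a spotlight node pointed back at the subject, with the whole set parented or linked to the director camera.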
This currently works with any character in your library. The beta focus is Daz G1-G9 characters since most users have a ton of them along with their poses.
You can even ask for a supported character by name, and Gx (the selected figure) will morph into that character. This can happen on frame 0, or over time if you want a transformation. Specifying, for example, "Bishop over 30 frames" morphs the G9 character into Bishop over that span. You can chain requests to then have Bishop transform into Desmond over 30 more frames, with a gap holding Bishop for x frames in between: G9 -> Bishop, wait x frames -> Desmond.
There is a playback option to review your transformation so you can adjust it at any point. Stop a transform in the middle and ask for a different character target, and it will switch to it.
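The chained-transformation timing described above reduces to a small keyframe schedule. A minimal sketch of one way to lay it out (my own illustration; the actual tool's interpolation may differ), ramping each target morph linearly from 0 to 1 and dialing the previous target back out during the next segment:

```python
def morph_timeline(segments, hold=0):
    """Build (frame, morph_name, value) keyframes for chained character
    transformations, e.g. G9 -> Bishop over 30 frames, hold, -> Desmond.

    segments: list of (morph_name, duration_frames) tuples.
    hold: frames to pause on each finished character before the next ramp.
    """
    keys = []
    frame = 0
    prev = None
    for name, duration in segments:
        keys.append((frame, name, 0.0))        # new target starts at zero
        if prev:
            keys.append((frame, prev, 1.0))    # previous target still fully on
        frame += duration
        keys.append((frame, name, 1.0))        # new target fully morphed
        if prev:
            keys.append((frame, prev, 0.0))    # previous target dialed back out
        frame += hold
        prev = name
    return keys
```

For "Bishop over 30 frames, hold 15, then Desmond over 30", `morph_timeline([("Bishop", 30), ("Desmond", 30)], hold=15)` puts Desmond's ramp at frames 45-75 while Bishop ramps back down over the same span.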
@sidcarton1587 please check your PM. A few questions about additions to your plugin.
I replied back to you with some details, but I thought I would repost here for people who are also interested.
I've been adding features and improvements to the DAZ Script Server plugin that might be of interest. You can find the plugin code at https://github.com/bluemoonfoundry/daz-script-server. The README.md file there describes how to build, install, and use the plugin, including what APIs are available.
I am also developing an MCP server that goes with the script server; it's intended to make DAZ Studio a tool that LLMs can integrate into their workflows. It's currently very much in beta as I'm adding a bunch of upgrades, so it's not exactly stable yet, but in case anyone wants to play with it, it's located at: https://github.com/bluemoonfoundry/daz-mcp-server
Thanks for the update.
Just in case anyone is following, I've added substantial updates to the Script Server and the MCP Server.
See https://github.com/bluemoonfoundry/daz-script-server and https://github.com/bluemoonfoundry/daz-mcp-server
The main updates are tighter integration between the two, async requests to support long-running scripts like batch renders, and, probably most important, 70 tools/skills/resources exposed through the MCP server that can be leveraged by LLMs. The next big update will expose the DAZ content library to natural language search, which will also be exposed as a tool.
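Under MCP, each of those tools is invoked with a JSON-RPC 2.0 `tools/call` request. A minimal sketch of what a client would send (the tool name and arguments below are hypothetical; see the daz-mcp-server README for the real schema):

```python
import json

def mcp_tool_call(tool_name: str, arguments: dict, request_id: int = 1) -> str:
    """Serialize a JSON-RPC 2.0 request for the MCP tools/call method."""
    return json.dumps({
        "jsonrpc": "2.0",
        "id": request_id,
        "method": "tools/call",
        "params": {"name": tool_name, "arguments": arguments},
    })

# Hypothetical tool name and argument -- not the server's actual schema:
msg = mcp_tool_call("run_daz_script", {"script": "print(Scene.getNumNodes());"})
```

The LLM never sees this plumbing; it just picks a tool by name from the server's `tools/list` response, and the MCP client handles the transport.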
Wow, quite an improvement. Will update shortly.