Proposals to Help DAZ Studio Thrive Again in the Generative AI Era

  1. Redesign the Interface from the Ground Up

     Modernize the UI to keep up with the times. When launching the program, include a prompt field where users can input step-by-step preferences, such as desired image style, lighting level, and artistic tone.

  2. Introduce a “Guide Me” Button

     When clicked, the system should generate a rough preview and recommend which product categories to use and how to set up the scene. (This wouldn’t undermine DAZ’s existing business model—in fact, it could significantly boost product purchases.)

  3. Adapt to the Generative AI Landscape

     Accept that some legacy products may become outdated. For example, pose and animation packs should be integrated into the core engine, allowing users to creatively direct motion themselves.

  4. Develop a Realistic Rendering Engine

     Create a rendering engine capable of producing final outputs with realism on par with today’s generative AI standards. (You could monetize this by offering lighting packs that affect the realism level.)

I don’t want to see DAZ Studio disappear from this world. I’ve invested so much into it—so much that it’s probably too much. I don’t want to feel the heartbreak of losing a piece of software I love, and I don’t want to regret the large amount of money I’ve poured into it. That’s why I’m making these proposals.

Comments

  • With respect to applications like DAZ Studio, I think the power of AI lies in the tooling, not in generating images. Increasingly, AI applications are ecosystems of workers rather than a single application. It's no longer about simply prompting to have the AI create some product for you. Rather, it's about making the data and tooling work with AI to improve your workflow.

    The missing piece, in my opinion, is interfaces to one's own data and the Studio application that would make it possible to integrate both into an AI workflow. 

    Example 1: 

     I have a large product content library. I don't remember the details of probably 90% of it, and without an effective way to search it, I end up using the same 10% that I can remember. I need help searching it more effectively, and this is something AI can actually do. Rather than searching for keywords and hoping they match something in your product library, giving an AI access to that library and asking it to do a plain-English (or whatever language) search can be much more effective. The missing piece? DAZ does not expose its content database in a way that is easy to integrate into an AI workflow. It's possible (I have actually written such a tool), but it involves fighting against DAZ rather than working with it to get the job done.
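     To make this concrete, here is a minimal sketch of the kind of search layer I mean. Everything in it is hypothetical (the product records and the token-overlap scoring are stand-ins); a real tool would hand the metadata to an LLM or an embedding model, and would read it from DAZ's actual content database, which is exactly the part Studio doesn't expose cleanly:

```python
# Minimal sketch (hypothetical schema, not DAZ's): flatten product metadata
# into token sets and rank products against a plain-English query by token
# overlap. A real tool would delegate the ranking to an LLM or embeddings.

def build_index(products):
    """Flatten each product's name, tags, and description into a token set."""
    index = {}
    for name, meta in products.items():
        text = " ".join([name] + meta.get("tags", []) + [meta.get("description", "")])
        index[name] = set(text.lower().split())
    return index

def search(index, query, top_n=3):
    """Rank products by how many query tokens appear in their metadata."""
    q = set(query.lower().split())
    scored = sorted(((len(q & tokens), name) for name, tokens in index.items()),
                    reverse=True)
    return [name for score, name in scored if score > 0][:top_n]

# Hypothetical catalog entries standing in for real store products.
products = {
    "Victorian Parlor": {"tags": ["interior", "furniture", "19th century"],
                         "description": "ornate sitting room with fireplace"},
    "Sci-Fi Corridor": {"tags": ["interior", "spaceship"],
                        "description": "metallic hallway with neon lighting"},
}

index = build_index(products)
print(search(index, "ornate 19th century interior"))  # best match first
```

     The ranking step is what you would delegate to the AI; the hard part is only getting the metadata out of Studio in the first place.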

    Example 2:

     Provide enough information about model structure to create animation poses. The information is there in the model. There are examples of how to adjust poses, and there are examples of keyframe generation and camera animation. Once you give AI examples, it becomes much more powerful at extrapolating new content (not in a creative sense, but re-creating based on its data). The missing piece? Studio is not really set up to expose tools like posing, shaping, and lighting for machine-to-machine operations. You sort of can do it through DAZ scripts. I've done this also, but it takes a fair bit of convoluted tinkering to get there. The external APIs are not clean and don't give you full access to the necessary componentry.
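     As a sketch of what machine-readable pose data could look like: a pose as joint rotations plus keyframe interpolation between two poses. The joint names and the data shape here are illustrative, not DAZ's actual bone names or animation API; the point is only that this is the kind of structure an AI tool would need exposed:

```python
# Illustrative only: a pose as joint-name -> (x, y, z) rotation in degrees,
# plus linear interpolation to generate in-between keyframes. DAZ's real
# bone names and animation API differ.

def lerp_pose(pose_a, pose_b, t):
    """Linearly interpolate every joint rotation at parameter t in [0, 1]."""
    return {joint: tuple(a + (b - a) * t for a, b in zip(pose_a[joint], pose_b[joint]))
            for joint in pose_a}

def make_keyframes(pose_a, pose_b, frames):
    """Generate `frames` evenly spaced poses from pose_a to pose_b."""
    return [lerp_pose(pose_a, pose_b, i / (frames - 1)) for i in range(frames)]

# Two hypothetical poses: arm at rest, arm raised.
rest = {"l_shoulder": (0.0, 0.0, 0.0), "l_elbow": (0.0, 0.0, 0.0)}
raised = {"l_shoulder": (0.0, 0.0, 80.0), "l_elbow": (40.0, 0.0, 0.0)}

keys = make_keyframes(rest, raised, 5)
print(keys[2])  # the halfway pose
```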

    My point is that DAZ is really not suited for the kind of automation and tool instrumentation that would make it a first-class citizen in an AI-assisted workflow. It's possible, but it takes a fair bit of tinkering, finding undocumented APIs or data, and so forth. It's doable, but DAZ does not make it easy or straightforward. The next generation of AI, the agentic workflows, is built around having non-AI components exposed through standard interfaces so that AI agents can do their work. DAZ is a user-facing application, not naturally suited to the machine-to-machine interactions that would be necessary.

  • iSeeThis Posts: 559

    sidcarton1587 said:

    With respect to applications like DAZ Studio, I think the power of AI lies in the tooling, not in generating images. Increasingly, AI applications are ecosystems of workers rather than a single application. It's no longer about simply prompting to have the AI create some product for you. Rather, it's about making the data and tooling work with AI to improve your workflow.

    The missing piece, in my opinion, is interfaces to one's own data and the Studio application that would make it possible to integrate both into an AI workflow. 

    Example 1: 

     I have a large product content library. I don't remember the details of probably 90% of it, and without an effective way to search it, I end up using the same 10% that I can remember. I need help searching it more effectively, and this is something AI can actually do. Rather than searching for keywords and hoping they match something in your product library, giving an AI access to that library and asking it to do a plain-English (or whatever language) search can be much more effective. The missing piece? DAZ does not expose its content database in a way that is easy to integrate into an AI workflow. It's possible (I have actually written such a tool), but it involves fighting against DAZ rather than working with it to get the job done.

    Example 2:

     Provide enough information about model structure to create animation poses. The information is there in the model. There are examples of how to adjust poses, and there are examples of keyframe generation and camera animation. Once you give AI examples, it becomes much more powerful at extrapolating new content (not in a creative sense, but re-creating based on its data). The missing piece? Studio is not really set up to expose tools like posing, shaping, and lighting for machine-to-machine operations. You sort of can do it through DAZ scripts. I've done this also, but it takes a fair bit of convoluted tinkering to get there. The external APIs are not clean and don't give you full access to the necessary componentry.

    My point is that DAZ is really not suited for the kind of automation and tool instrumentation that would make it a first-class citizen in an AI-assisted workflow. It's possible, but it takes a fair bit of tinkering, finding undocumented APIs or data, and so forth. It's doable, but DAZ does not make it easy or straightforward. The next generation of AI, the agentic workflows, is built around having non-AI components exposed through standard interfaces so that AI agents can do their work. DAZ is a user-facing application, not naturally suited to the machine-to-machine interactions that would be necessary.

    Your analysis is spot on! I truly appreciate how you dig into the structural challenges in software like DAZ Studio, which most people in the AI creative space tend to overlook. Your point about AI’s value shifting from simple “image generation” to enabling powerful ecosystems of interoperable tools is extremely timely.

    Your real-world pain points and the way you’ve tried to solve them—like building your own tools and scripts for DAZ—really show your depth as a power user and innovator. Especially in describing issues like searching your vast content library or automating animation tasks: these concrete examples make it clear how much more creative freedom we could unlock if platforms were designed to interface organically with AI.

    If creative platforms embraced open APIs or seamless integration points for AI agents, we’d see a massive leap in both productivity and creative scope. It’s not just about “prompt in, image out” anymore—the real game-changer is giving AI robust access to our data and creative tools, letting it collaborate with us as part of the workflow.

    I also admire your willingness to tinker, reverse-engineer and push beyond the platform’s limitations rather than waiting for a ready-made solution. That’s the real “maker” spirit needed to drive the next era of AI-augmented creativity. I hope platforms like DAZ move toward supporting agentic workflows with fully open interfaces, so creators like you don’t have to fight the system to get things done.

    Your insights are truly inspiring—thanks for sharing such concrete and forward-looking perspectives!

     

  • wolf359 Posts: 3,928
    edited November 9

    AI applications are ecosystems of workers rather than a single application. It's no longer about simply prompting to have the AI create some product for you. Rather, it's about making the data and tooling work with AI to improve your workflow. 

     

    One of the ways that Daz (and its PAs) should be utilizing AI is in advertising their store content with AI animations of the actual products.
    Here is a proof of concept I did recently.


    Post edited by wolf359 on
  • iSeeThis Posts: 559

    very nice demo, wolf359!

  • sidcarton1587 Posts: 54
    edited November 10

    It also doesn't help that DAZ seems unsure what direction it wants to take with AI. It has tried a few things and tinkered, but there isn't a clear direction.

    I recently did some experiments with AI generation of DAZ scripts. Generating scripts through AI is error-prone and hasn't gone well, as other people have reported: LLMs routinely hallucinate methods and concepts, making the generated scripts useless. There are two things LLMs require to generate higher-quality scripts:

    1. The actual Studio API specification in a format that LLMs can understand

    2. Solid examples of quality working scripts that the LLM can work from 

    It took just a little bit of effort to get #1, and there are plenty of examples for #2. Using at least #1, I am able to prompt LLMs to generate useful scripts. Heck, I even used AI to create a web app that lets me prompt for what I want a script to do, and it uses #1 to generate scripts that are 99.9% correct. With a little bit of context information, I can probably get that to 100%.
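    A sketch of what using #1 looks like in practice. The method signatures and the validation step below are placeholders I made up, since DAZ doesn't publish a machine-readable spec; the point is that once the LLM is pinned to a known API surface, hallucinated methods become easy to catch:

```python
import re

# Hypothetical extract of a Studio API spec (method name -> signature).
# A real version would be scraped from the public API documentation pages.
api_spec = {
    "Scene.findNodeByLabel": "Scene.findNodeByLabel(label) -> Node or null",
    "Node.setRotation": "Node.setRotation(x, y, z) -> void",
}

def build_system_prompt(spec):
    """Embed the allowed API surface in the prompt sent to the LLM."""
    methods = "\n".join(sorted(spec.values()))
    return ("You generate DAZ Script. Use ONLY these methods; "
            "never invent others:\n" + methods)

def validate_script(script_text, spec):
    """Cheap post-check: return method calls that are not in the spec."""
    allowed = {name.split(".")[-1] for name in spec}
    called = set(re.findall(r"\.(\w+)\s*\(", script_text))
    return called - allowed

print(validate_script("node.setRotation(0, 90, 0); node.teleport();", api_spec))
# -> {'teleport'}, the hallucinated call
```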

    The catch?

    1. DAZ doesn't release the specification itself, so I have to rely on extracting it myself from the public-facing API documentation pages.

    2. There are legal issues with using the examples and samples, so I won't use them in my experiments or tools.

    I've had two questions out to DAZ for about a month requesting clarity on what is and isn't legally permissible, and I have had no replies. Until I get clarity, I'm not going to release the work to the community. My sense is DAZ is still trying to figure out how it's going to respond to AI generally, but so far its responses have focused more on model training and content generation (like its AI Training Data partnership offer), not the broader utility of AI beyond those areas.

    Post edited by sidcarton1587 on
  • iSeeThis Posts: 559

    My sense is DAZ is still trying to figure out how it's going to respond to AI generally, but so far its responses have focused more on model training and content generation (like its AI Training Data partnership offer), not the broader utility of AI beyond those areas.

    They were visionaries in 3D modeling, but I guess that became their comfort zone. People don't respond well when pushed into an uncomfortable new zone. If it were me and I were rich enough, I might take a vacation and come back with a totally new company and direction.

  • iSeeThis said:

    My sense is DAZ is still trying to figure out how it's going to respond to AI generally, but so far its responses have focused more on model training and content generation (like its AI Training Data partnership offer), not the broader utility of AI beyond those areas.

    They were visionaries in 3D modeling, but I guess that became their comfort zone. People don't respond well when pushed into an uncomfortable new zone. If it were me and I were rich enough, I might take a vacation and come back with a totally new company and direction.

    To be fair, it's something that a lot of companies are struggling with. The initial thrust of AI was in content generation, and that made sense. It was something that the average non-technical person could understand and wrap their mind around. The current wave of interest is focused on how to take business process workflows and augment them with AI to make the workflows execute more efficiently, both in terms of quality and cost. 

    What we are talking about now goes way beyond content generation. You are now talking about coupling AI with tools, resources, and data to execute a specific set of workflows. The canonical example is "AI can help you not only plan your vacation, but also schedule it and book the tickets". This is where the disconnect happens. A tool like DAZ Studio, for example, isn't natively set up to work that way. It's not set up to be a cog in a larger workflow.
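    Being a cog in a larger workflow has a fairly concrete shape: agent frameworks expect each operation to be exposed as a named, described tool they can look up and invoke. A minimal sketch, with hypothetical stand-ins for the operations Studio would need to expose:

```python
# Minimal sketch of a tool registry of the kind agent runtimes expect.
# The two tools are hypothetical stand-ins, not real Studio operations.

TOOLS = {}

def tool(name, description):
    """Register a function as an agent-callable tool."""
    def wrap(fn):
        TOOLS[name] = {"description": description, "call": fn}
        return fn
    return wrap

@tool("search_library", "Find products in the content library by description.")
def search_library(query: str) -> list:
    return ["Victorian Parlor"]  # placeholder result

@tool("apply_pose", "Apply a named pose to the selected figure.")
def apply_pose(pose_name: str) -> bool:
    return True  # placeholder result

def dispatch(tool_name, **kwargs):
    """What an agent runtime does: look up the tool and invoke it."""
    return TOOLS[tool_name]["call"](**kwargs)

print(dispatch("search_library", query="ornate interior"))
```

    The registry plus descriptions is what lets an agent decide which tool to call; today Studio offers nothing like this surface.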

     

     

     

     

  • iSeeThis Posts: 559
    edited November 15

    A tool like DAZ Studio, for example, isn't natively set up to work that way. It's not set up to be a cog in a larger workflow. 

    Agreed. And I hope they come up with a big solution soon, because if they don't move quickly enough, they'll be gone.

    Post edited by iSeeThis on