Licensing Agreement | Terms of Service | Privacy Policy | EULA
© 2025 Daz Productions Inc. All Rights Reserved.
Comments
@V3Digitimes:
Hi, I just found something you should not do with Scene Optimizer (this is on the macOS version of DS; I'm trying to get it confirmed on Windows as well). The reason I did what I did is that I had set up a scene using the Public Beta and used Scene Optimizer to scale down the textures of a "miniature" of a large set.
I wanted to render using Boost4Daz, but Iray Server does not currently support the latest Iray in the Public Beta, so I opened the scene in the Release version, and DS crashed hard when it could not find the scaled-down images.
I was able to solve it by moving the V3D image folder from the Public Beta temp directory to the Release temp directory.
I guess the bug is in DS, not in your product, but I thought a fair warning about a possible "Oops!" was due.
Cheers,
/ Totte aka Code 66
Hi @Totte, thanks so much for letting me know!! Yes, I agree with you, the crash probably comes from Daz Studio itself, since the low-scale maps created have nothing particular about them (I mean in comparison with any other map you would create yourself and load in a surface property slot). Maybe there were too many maps DS could not find (plus it could not find an associated product) and that resulted in a crash... Or maybe it's linked to a specific way temp folders are handled between different versions. Anyway, thanks for sharing the solution here; this way I'll be able to answer people who might run into the same issue.
On my side, I use copies of the maps directly in their original content runtime folder (since I always install with DIM), so I've never run into this issue (plus all my DS versions share the same content folders). That's why it's so nice of you to share this info here!
Have a great day!
@V3Digitimes
Thanks for creating this product!

One question - I have started using PNG files to preserve quality without having to resort to bigger maps. Is it possible for Scene Optimizer to save the output as PNG files? I didn't see the option, and when I've run it I see PNG images get converted to JPG images.
Edit: Upon a closer look after running out of VRAM on a big scene, I see it doesn't resize the PNG images at all. Is it possible to add this?
Hi, thanks for the nice feedback.
-> If your PNG has no transparency channel (zero transparency), the resolution reduction should be done automatically (I don't think it depends on options; or else my configuration happens to have the right ones, and I can share them upon request).
-> For other PNGs (with alpha transparency), there is a checkbox option in the first tab: "Also reduce LIE maps / Also replace on surfaces png files with alpha transparency....". It must be checked if you want those images to be processed (I think it is unchecked by default).
This will replace your (for instance) 4096x4096 PNG files with (for instance) 1024x1024 PNG files. The HUGE drawback is that it cannot keep the transparency channel (the transparent pixels of the PNG will be set to black). This is why processing such maps was added as an option, back when Genesis (2? 3? 8?) eyelashes were almost all made using PNGs with transparency, which caused dramatic issues.
I just re-tested on a non-transparent PNG (all pixels 100% visible): it changes the resolution as expected whether I check this box or not. For PNGs with transparency, only if you check this box is the image converted to a new PNG (lower res) using black where the opacity was 0 (this cannot be chosen). Sometimes it's super annoying, like for eyebrows, and sometimes not at all (like the borders of a tree leaf, around the leaf pattern, which are not visible in theory thanks to an additional opacity map).
Let me know if it solved your issue.
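The flattening effect described above can be sketched with a few lines of Pillow. This is an illustration of the behavior, not Scene Optimizer's actual code, and the file names are hypothetical:

```python
# Illustration only: why downscaling a PNG with alpha transparency and
# flattening it turns the transparent pixels black. Requires Pillow.
from PIL import Image

def downscale_flatten(src_path, dst_path, factor=4):
    img = Image.open(src_path)
    small = img.resize((img.width // factor, img.height // factor),
                       Image.LANCZOS)
    if small.mode == "RGBA":
        # Composite onto opaque black: wherever opacity was 0, the
        # output pixel becomes black, as described above.
        black = Image.new("RGBA", small.size, (0, 0, 0, 255))
        small = Image.alpha_composite(black, small).convert("RGB")
    small.save(dst_path)

# Hypothetical usage:
# downscale_flatten("eyelashes_4096.png", "eyelashes_1024.png", factor=4)
```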
You're welcome @V3Digitimes
Anyone who has thought about this, it's a huge time saver if you're working with large scenes and can't hide things out of the camera's line of sight due to mirrors or other reflective surfaces.
That worked. I would have posted earlier but closed the tab and lost the thread. One interesting thing I noticed is that for some reason it appears that the Metallic Weight Flakes maps aren't reduced. Not a big deal, just something I noticed with all my spot checking.
Thanks a lot for your feedback! That's very nice of you to take the time to post here :)
It's curious that the metallic flakes maps were not reduced. Maybe they were PNGs with transparency and the box to process PNGs with transparency was not set properly... Otherwise, the optimizer is not supposed to care about which property is using the map, or which shader is used... The only restriction is the type of map (and, for PNGs, whether they use transparency or not).
Edit: I'll check whether another reason could cause this.
@V3Digitimes
I bought Scene Optimizer ages ago, but only just started using it. It saves me a lot of time when rendering the viewport using Iray, but it doesn't seem to impact my full render times (Ctrl+R). I'm guessing I'm missing a setting somewhere, but I can't find any information about it in the documentation. Can you please advise?
One other thing: I reduced my scene maps by a factor of 2, saved it as a new scene, opened it, then ran Scene Optimizer again, but the map sizes are still showing the same as the originals. I would have expected half the original resolution. What am I expected to see here?
Thanks!
If your scene originally fit in your GPU (video card), Scene Optimizer will give you somewhere between a 5 and 25% rendering speed-up (it depends a lot on the scene). It is when the scene is too big to fit on your video card that the optimization really makes sense, since you can then gain a factor of 10 in rendering speed. Said differently, the bigger the scene, the higher the impact: for instance, if your scene is a plane with one 4K texture, you won't gain time by optimizing; but if you have 3 figures dressed, clothed, "haired", in a heavy environment, you will probably make them fit on your video board (where they did not fit before), with compromises and choices you can make via the optimizer, and gain a huge amount of time.
Furthermore, remember that render time depends not only on the cumulated texture size, but also on the number and size of meshes and nodes, as well as on some instance rendering properties when instances are used. If the texture size has a minor impact whereas the size of the meshes has a huge impact, then reducing the number of nodes and their resolution will be more efficient than reducing the textures alone. You have the distance to the camera in the optimizer interface, allowing you to select (via the checkboxes) the farthest nodes in order to reduce their texture sizes more drastically, remove some of their maps (a bump or normal map 20 meters away may not be necessary), or strongly lower their resolution (no need for high-res objects if they are small and/or distant). To make sure the old maps are cleared from Daz's memory, I generally restart Daz before a render (but that's part of my workflow anyway, whether I "optimize" or not).
After an optimization, you may still see the same size in the "biggest map size" column, precisely because I use the "biggest". Indeed, some maps (for instance makeup maps, any map with alpha transparency — by default, when the checkbox is unchecked — or some TIFF normal maps) are not processed (because the transparent alpha channel would become black, or there would be too big a loss in quality). This is why, whereas 90% of the maps went down from 4096x4096 to 2048x2048 or 1024x1024, 10% of them remain at 4096x4096, and this is why, when you relaunch the script, you see that the biggest map is 4096x4096 (even though most of the textures were reduced). This is information I should present better, but I'm not satisfied with any idea I have so far (should I add a "smallest map size" or "average map size" column?).
Normally you can check very quickly that you are working with reduced images: select a "reduced" surface with the Surface Selection Tool and go to the Surfaces pane (Editor subpane). In "Base Color", look at the name of the map by hovering your mouse over the slot that holds the image thumbnail, or look at the list of maps in the scene by left-clicking on this slot. You should see that the reduced images now have the original image names with the new size at the end (for instance, "myimage.jpg" becomes "myimage 1024x1024.jpg"). This way you will know whether the images were reduced or not.
I will really think about how to make this information clearer for the user. Even I am sometimes annoyed by this "biggest" image issue when some images were not reduced (which is the normal, optional behavior for some formats or image specificities).
Still hoping for an update with columns for Normal and Bump Maps on the Materials\Visibility tab of the Scene Optimizer script, so one can see which asset(s) have Normal and Bump Maps. A column for Displacement Maps would be great too.
It's still on my to-do list, but sadly it's still at the bottom (the to-do list is organized by urgent/not urgent, important/not important). Remind me why you need that information? (Because if you want to remove normal or bump maps, you don't need to know whether an object has some; normally you simply ask for removal, and the ones that have such maps are processed.)
Because there might be an asset with normal/bump maps in the foreground which would render better with them, while objects further back don't need them. The same goes for displacement maps.
I see your point: you don't want to remove the normals of the elements close to the camera. But this is why you have a "distance" column allowing you to uncheck the closest ones. Plus, you have the restore options whenever you remove them by accident. Why not use these tools? It would not change the workflow, no?
Hello, I'm trying to use the product "Scene Optimizer".
Although I set up the folder for low-resolution maps, after I executed the script, that folder was still empty.
Instead, the compressed images appeared in the runtime folders of each product.
This bothers me a lot: it has increased the space occupied by my various products' folders.
Moreover, the folder I set up for storing the compressed images seems to have become meaningless.
I'm using the current latest version of "Scene Optimizer".
I never used that feature; I usually just keep the scene I use in full resolution, then save a copy, and only that copy gets the treatment from Scene Optimizer (as someone who often reuses a scene several times with little changes, this proved to be the best working solution).
@kyoreona: Hi, have you checked the box "Also use this folder for Maps of other installation methods (DIM, Manual)"?
Because the behavior you describe is what you get when this is not checked. This checkbox is just below where you set up the folder. Basically, the folder you set was made for "Smart Connect" installed products, because for those products you cannot write into the runtime; but if you install your products with DIM or manually and you want to make SURE that the folder you specify is used, then you must check this checkbox. Let me know if your problem is not solved by doing this, I'll be here to help. This was the easiest way to let people choose between the runtime and a specified folder when products are installed with DIM/manually; I'm sorry if it is not clear.
How can I revert scene optimizer changes without the text files?
I've received some scenes from a friend, and the optimized textures are missing. Obviously the text files are not available either. Is there a way to revert the changed textures to the originals?
Hi, what you're trying to do isn't straightforward if the text files weren’t saved at the time of optimization. However, there are a few possible ways to recover the original textures:
If your friend still has the original scene (before using Scene Optimizer), they can generate the text files retroactively. You could then use the restore scripts on your end, after editing the text files to match your file paths (e.g., changing the drive letter or the base directory). You’d likely need a mass text editor for this (like Notepad++), and I’d be happy to help if you choose this route.
If your friend saved the maps in the original runtime (or in a custom runtime), they can zip and send the relevant subfolders. Since the paths in Daz scenes are relative when using runtimes, you should be able to use them just by placing them in one of your own runtimes.
If the optimized maps were saved in a specific folder outside of any runtime, your friend can zip that folder and send it to you. In that case, you’ll need to edit the scene file to adjust the file paths from their setup to yours. For that, it’s easier if the scene is saved in uncompressed .duf format (they may need to re-save it that way). You can use Notepad++ for the edits, and again, I can assist if needed.
For characters and wearables, many of them have Smart Content links to their original material presets. You can try using Smart Content to reapply the materials — and if you hold the Control key while applying, you should get options to load only the maps. For props or subsets, it’s less obvious, but you can try reloading the item, copying the parameters from the old instance (Ctrl+C in the Parameters pane) and pasting them into the new one (Ctrl+V). Make sure the Parameters tab is active — I’m not 100% certain, but that’s usually needed.
Let me know if any of these options help, or if you'd like to dive deeper into one of them. Happy to assist further!
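For the path-editing route, here is a minimal sketch of the idea, assuming an uncompressed .duf (which is plain JSON text); the base paths are hypothetical examples, not real library locations:

```python
# Sketch: remap the sender's content-library prefix to yours inside an
# uncompressed .duf scene file. Note that .duf stores forward-slash,
# URL-style paths, and spaces may be encoded as %20, so check the file
# for the exact form before replacing.

def remap_scene_paths(duf_text, old_base, new_base):
    # A plain text substitution on the base directory is usually enough,
    # which is also what a mass replace in Notepad++ would do.
    return duf_text.replace(old_base, new_base)

scene = '{"url" : "/E:/FriendLibrary/Runtime/Textures/example.jpg"}'
print(remap_scene_paths(scene, "/E:/FriendLibrary", "/C:/MyLibrary"))
```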
I'd watch out for licensing violations if asking a friend to send you texture maps (even modified ones) from commercial products.
My friend is quitting daz and most likely deleted everything by now. But I'll give editing the scene files a go.
Maybe Scene Optimizer could benefit from a script that looks for ****x**** and ***x*** at the end of file paths and trims them to try to re-link to the originals, since that's what Scene Optimizer usually adds.
I've seen paths that have been optimized multiple times too.
From what I can see, Scene Optimizer shows products with missing textures in the list, so I assume it has access to the current path. It doesn't have to do a full search; just checking all libraries for a trimmed address should suffice in most cases. For example, for:
/Runtime/Textures/DAZ/Characters/Genesis8/Michael8/Michael8_Arms_NM_1004 2048x2048.tif
check if a:
/Runtime/Textures/DAZ/Characters/Genesis8/Michael8/Michael8_Arms_NM_1004.tif
exists in any of the libraries.
It doesn't have to solve all cases either, just reduce the numbers so we can handle the remaining manually.
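The trim-and-relink idea above could be sketched like this. It is only a sketch, assuming the " 2048x2048"-style suffix pattern shown in the example path; the library roots would come from the user's setup:

```python
import os
import re

# Strip one or more " 2048x2048"-style suffixes (a scene optimized more
# than once can carry several) just before the file extension.
SIZE_SUFFIX = re.compile(r"( \d{3,4}x\d{3,4})+(?=\.[^.]+$)")

def original_path(optimized_path):
    return SIZE_SUFFIX.sub("", optimized_path)

def find_original(relative_path, library_roots):
    # Look for the trimmed file in each content library; return the
    # first match, or None so the remaining cases can be handled by hand.
    trimmed = original_path(relative_path).lstrip("/")
    for root in library_roots:
        candidate = os.path.join(root, trimmed)
        if os.path.isfile(candidate):
            return candidate
    return None

print(original_path(
    "/Runtime/Textures/DAZ/Characters/Genesis8/Michael8/"
    "Michael8_Arms_NM_1004 2048x2048.tif"))
```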
Speaking of which... I hate the interface Daz presents for missing files: having to click an error message for every address, locating a file, and in most cases, if the file doesn't exactly match, the error repeats. While you're at it, maybe you could code a more elegant interface for missing files too, like the one Scene Optimizer has.
Yes, it should work as long as the lower-resolution images were saved directly in the original runtime folder, or if they were saved with the option to preserve the runtime subfolder hierarchy checked. It is only problematic if you are outside both of those configurations (as I am in general).
I did try editing the scenes with Notepad, and it does fix the missing textures, though I think most users won't bother with changing the directories. It might be a good idea for Scene Optimizer to suggest setting up an additional content library folder to hold the changed textures while keeping their addresses intact.
I can have a look at the best way to let people know the procedure, or to offer it clearly as a choice. This is interesting feedback, thanks.
Hi everyone,
I’ve finished an updated build of Scene Optimizer, which includes several small improvements and ensures full compatibility with Daz Studio 2025, while keeping support for DS4.
This build introduces Normal Map detection and counting (DS6 only, user request, including detailed normal maps), faster performance across all DS versions, and a few workflow refinements — such as the new Hi/Low Resolution switch buttons and the Open Documentation shortcut.
After recent discussions with Daz Team, it’s still better to wait until the Daz Studio 6 scripting API is more stable before any official submission, but I’d like one or two experienced users (whether you already own the product or not — test it and keep it!) to try this version and confirm that everything works as expected under normal use — toolbar, texture counting, resolution switches, folder change, etc.
If you’re interested, please reply here or send me a PM.
I’ll provide the package and short notes about what changed — nothing complicated, just a quick validation pass that helps me prepare more reliable next builds.
Thanks a lot for your help!