LOD System

https://www.daz3d.com/daz-lod-system-and-cleaner (I couldn't find anything in the PA products forum.)

So, is this a competing product to Scene Optimizer? It doesn't seem like one would need to use both. It says it's written in Python, I assume that means it is a plugin?

Loading the images into GIMP and doing a difference on the layers does reveal the images are different. But those differences seem small.


Comments

  • WendyLuvsCatz Posts: 40,437

    I think it works like mip mapping 

    replacing textures on objects further away from the camera with progressively lower resolution maps

    most game engines do this but they use dds files which have lots of progressively smaller layers
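To make the mip analogy concrete, here is a pure-Python sketch of the layer sizes a DDS-style mip chain holds (illustrative only; real engines pack all the levels into a single file):

```python
def mip_sizes(width, height):
    """Return the sizes of a DDS-style mip chain: each layer is half the
    previous one in both dimensions, down to 1x1."""
    sizes = [(width, height)]
    while width > 1 or height > 1:
        # Halve each dimension, never going below 1 pixel
        width, height = max(1, width // 2), max(1, height // 2)
        sizes.append((width, height))
    return sizes
```

For a 4096x4096 map this yields 13 layers, and the whole chain costs only about a third more memory than the top level alone, which is why engines can afford to keep it resident.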

  • Ainm Posts: 764
    edited February 11

    WendyLuvsCatz said:

    I think it works like mip mapping 

    replacing textures on objects further away from the camera with progressively lower resolution maps

    most game engines do this but they use dds files which have lots of progressively smaller layers

    LOD textures also increase VRAM use because they hold more information. They speed up rendering time, which is why they're used in games that need to give the illusion of a realtime world. As I understand the description of this product, it will create a new version of a texture for items that are far away, but keep original textures for items that use the same textures closer to the camera. Definitely a LOD (level of detail) system and it should speed up rendering time, but if you have a scene with an object repeated at different distances from the camera, it could also increase VRAM requirements on those. Seems a useful utility if it works how I imagine it might, but VRAM saving will be dependent on your scene. I wonder how - or if - it handles instancing.

    I've seen this product before somewhere. I wonder if I purchased it elsewhere already or if I remember it from a forum discussion somewhere. Anyone else recognise it? Don't want to buy it twice if I can help it, but can't for the life of me remember where I've seen it to check whether I have it already.

    Post edited by Ainm on
  • Elor Posts: 3,444

    Ainm said:

    I've seen this product before somewhere. I wonder if I purchased it elsewhere already or if I remember it from a forum discussion somewhere. Anyone else recognise it? Don't want to buy it twice if I can help it, but can't for the life of me remember where I've seen it to check whether I have it already.

    It was sold on Renderhub.

  • Elor said:

    Ainm said:

    I've seen this product before somewhere. I wonder if I purchased it elsewhere already or if I remember it from a forum discussion somewhere. Anyone else recognise it? Don't want to buy it twice if I can help it, but can't for the life of me remember where I've seen it to check whether I have it already.

    It was sold on Renderhub.

    That's helpful. The dev posted on Renderhub's forums that he thinks it completely replaces https://www.daz3d.com/scene-optimizer and goes beyond it.

     

  • Artini Posts: 10,455

    There is a video on YouTube about it:
    Daz LOD System (Better Scene Optimizer)

     

  • felis Posts: 6,048

    I feel that they don't really describe how their system works.

    In their promos they mention 21 GB of original textures compared to 1 GB of compressed textures. But Iray doesn't use compressed textures.

    And the image they used was a bird's-eye view, in a renderer that was not Iray.

    And in the video they are basically mocking Scene Optimizer for being slow.

    It is likely that Python is much faster than Daz Script, but personally I fail to understand how they do it.

  • felis said:

    I feel that they don't really describe how their system works.

    In their promos they mention 21 GB of original textures compared to 1 GB of compressed textures. But Iray doesn't use compressed textures.

    And the image they used was a bird's-eye view, in a renderer that was not Iray.

    And in the video they are basically mocking Scene Optimizer for being slow.

    It is likely that Python is much faster than Daz Script, but personally I fail to understand how they do it.

    Presumably they rescale the texture and replace it, not "compress" it, just like Scene Optimizer does. The difference is they do it "faster" somehow. How can you tell it wasn't Iray?

  • felis Posts: 6,048

    If that image is Iray, it looks like a toon render. Very homogeneous colors.

  • memcneil70 Posts: 5,515

    That was my first thought as well - that, or it came from a game program.

  • Totte Posts: 14,843

    jmucchiello said:

    https://www.daz3d.com/daz-lod-system-and-cleaner (I couldn't find anything in the PA products forum.)

    So, is this a competing product to Scene Optimizer? It doesn't seem like one would need to use both. It says it's written in Python, I assume that means it is a plugin?

    Loading the images into GIMP and doing a difference on the layers does reveal the images are different. But those differences seem small.

    Not a plugin (plugins are written in C++).
    It says DS script and Python; I've done that too (but not for anything in the store).
    You write a backend in Python and call it via DzProcess. The tricky part is getting the Python code signed and verified to run, especially on macOS, which has a much higher level of code protection these days.

    But there are some Python IDEs that handle that pretty well these days.

     

  • Tugpsx Posts: 819
    edited February 11

    Totte said:

    jmucchiello said:

    https://www.daz3d.com/daz-lod-system-and-cleaner (I couldn't find anything in the PA products forum.)

    So, is this a competing product to Scene Optimizer? It doesn't seem like one would need to use both. It says it's written in Python, I assume that means it is a plugin?

    Loading the images into GIMP and doing a difference on the layers does reveal the images are different. But those differences seem small.

    Not a plugin (plugins are written in C++).
    It says DS script and Python; I've done that too (but not for anything in the store).
    You write a backend in Python and call it via DzProcess. The tricky part is getting the Python code signed and verified to run, especially on macOS, which has a much higher level of code protection these days.

    But there are some Python IDEs that handle that pretty well these days.

     

    As Totte pointed out, the Daz-to-Python approach has been around for some time. My old Daz to Marvelous Designer script, released for free, contained interface code that allows DS scripts to launch Python programs.

    This texture-replacement process could be useful for some, but as others have pointed out, if you already have a tool that can perform the task, you will need to confirm that the new scripts work in Studio 2025/6 or you will have problems down the road. Will be curious to see what others think.

    Post edited by Tugpsx on
  • I had no idea calling Python from the scripting language was possible (and gaining access to the scene data).

  • WendyLuvsCatz Posts: 40,437

    jmucchiello said:

    I had no idea calling Python from the scripting language was possible (and gaining access to the scene data).

     Philemo's Virtual World Dynamics bridge does

    VWD is a Poser py script 

    (not that it works now since quite a few Windows and D|S updates)

    and being a code ignoramus, I have no solutions either

  • Elor Posts: 3,444

    jmucchiello said:

    (and gaining access to the scene data).

    Just open any non compressed .duf file and you'll see that the data inside are in plain text, using JSON to structure them: any tool capable of dealing with JSON will be able to deal with that data flawlessly.

    And if it's compressed, DS uses the good old gzip format, which is easy to work with.
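As a rough illustration of how approachable that makes the format, a few lines of Python can open either variant. This is a sketch, not a Daz API: the function name is made up, and the only assumption is the standard gzip magic-byte check.

```python
import gzip
import json

def load_duf(path):
    """Load a .duf scene file as a dict. Per the thread above, .duf is
    JSON, either plain text or gzip-compressed; sniff the gzip magic
    bytes to decide which opener to use."""
    with open(path, "rb") as f:
        is_gzip = f.read(2) == b"\x1f\x8b"
    opener = gzip.open if is_gzip else open
    with opener(path, "rt", encoding="utf-8") as f:
        return json.load(f)
```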

  • Wolfwood Posts: 878

    Competing products are nothing new here. But in general I see PAs being respectful of one another and avoiding direct conflict.

    Directly mentioning another PA's product in a diminishing way in your main promo image is not being a good sport.

    They should concentrate on providing proper real-world examples, not a simple bird's-eye view of a city; that is an extremely narrow view (pun intended) of the Daz use cases that could benefit from VRAM optimization.

  • vrba79 Posts: 1,515
    I'll stick with Scene Optimizer. It has been, and continues to be, very good to me.
  • WendyLuvsCatz Posts: 40,437
    edited February 11

    Elor said:

    jmucchiello said:

    (and gaining access to the scene data).

    Just open any non compressed .duf file and you'll see that the data inside are in plain text, using JSON to structure them: any tool capable of dealing with JSON will be able to deal with that data flawlessly.

    And if it's compressed, DS uses the good old gzip format, which is easy to work with.

    I think a dse is encrypted, and that is sadly what the VWD bridge uses

    7zip cannot extract it like duf files

    would ask Philemo but he vanished after the Covid pandemic and I fear the worst

    actually I see he last visited the forum in 2025 after several years' absence, hmm

    Post edited by WendyLuvsCatz on
  • jmucchiello Posts: 831
    edited February 11

    Elor said:

    jmucchiello said:

    (and gaining access to the scene data).

    Just open any non compressed .duf file and you'll see that the data inside are in plain text, using JSON to structure them: any tool capable of dealing with JSON will be able to deal with that data flawlessly.

    And if it's compressed, DS uses the good old gzip format, which is easy to work with.

    I was thinking it accesses the live data in the running instance. Of course you can read the files. (I'm constantly editing the actual files because of missing properties in store assets due to typos, for example)

     

    Post edited by jmucchiello on
  • Artini Posts: 10,455

    So, has anybody bought it? If so, please post your results.

     

     

  • Totte Posts: 14,843

    jmucchiello said:

    Elor said:

    jmucchiello said:

    (and gaining access to the scene data).

    Just open any non compressed .duf file and you'll see that the data inside are in plain text, using JSON to structure them: any tool capable of dealing with JSON will be able to deal with that data flawlessly.

    And if it's compressed, DS uses the good old gzip format, which is easy to work with.

    I was thinking it accesses the live data in the running instance. Of course you can read the files. (I'm constantly editing the actual files because of missing properties in store assets due to typos, for example)

     

    Easier to just collect all the current map references, toss them into a file, hand that file over to the Python side along with a "distance to camera", do the rescale, then hand a list back and update.
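A minimal sketch of the Python half of that round trip. The job format here - a list of {"map": ..., "size": ..., "distance": ...} entries - and the halving-per-doubling heuristic are invented for illustration, not the product's actual format or algorithm.

```python
import json
import math

def plan_rescales(job_path, out_path, base_distance=200.0, floor=256):
    """Read a JSON job file written by a DS script, pick a target
    resolution per map (one halving per doubling of camera distance,
    never below the floor), and write the answers back for the DS
    script to apply."""
    with open(job_path, "r", encoding="utf-8") as f:
        maps = json.load(f)
    results = []
    for m in maps:
        ratio = max(m["distance"] / base_distance, 1.0)
        level = int(math.log2(ratio))            # one halving per doubling
        target = max(m["size"] >> level, floor)  # clamp to minimum size
        results.append({"map": m["map"], "target": target})
    with open(out_path, "w", encoding="utf-8") as f:
        json.dump(results, f)
    return results
```

Keeping the interchange as a pair of JSON files matches how the thread says the product works (a data.json handed to a Python executable), and keeps the DS side trivially simple.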

  • kpr Posts: 346

    Anything that reduces texture usage is potentially welcome, so I'm also interested in seeing some reviews - especially given the lack of a PA page in the forums (which is often not a "good" sign).

    The image/video they are pushing seems to be demonstrating that those tiny buildings would still be 21 GB in texture size in native Daz - with so many buildings so visible, that's likely, but it's far from a typical Daz scene, and there are specific assets for doing cityscapes with reduced, manageable textures (Stonemason et al). I'd be interested in seeing some more typical (character-filled) Daz scenes and the results for those.

    One of the (many) benefits of Scene Optimizer is that you can pick and choose which textures get reduced (and by what factor) - this doesn't read like that.

    Less of a mystery, more info required.

  • Totte said:

    jmucchiello said:

    Elor said:

    jmucchiello said:

    (and gaining access to the scene data).

    Just open any non compressed .duf file and you'll see that the data inside are in plain text, using JSON to structure them: any tool capable of dealing with JSON will be able to deal with that data flawlessly.

    And if it's compressed, DS uses the good old gzip format, which is easy to work with.

    I was thinking it accesses the live data in the running instance. Of course you can read the files. (I'm constantly editing the actual files because of missing properties in store assets due to typos, for example)

     

    Easier to just collect all the current map references, toss them into a file, hand that file over to the Python side along with a "distance to camera", do the rescale, then hand a list back and update.

    Ha, @Totte is showing his battle scars and the wisdom he's gained from them.

    I've found that there is no perfect solution, because neither approach is documented very well and you're still going to have to reverse engineer things.

    Using the SDK to get runtime information is not optimal because the SDK lags far, far behind actual DS development, and the SDK doesn't employ any kind of component architecture that would allow it to keep up with development; it's just some header files and DLLs. C++ doesn't do introspection, so the SDK doesn't even know about the latest, greatest DAZ innovations.

    And reading JSON is not optimal either because, while it has all the data and JSON is to some degree self-documenting, it's a snapshot of the scene in a primordial state, before DAZ Studio has pulled it into the scene, i.e. sometimes one does not know how DS transforms that data to arrive at what you see on the screen. I believe that is why the Diffeomorphic addon requires .dbz files: even a verifiable coding genius like Thomas Larsson couldn't reverse engineer how autofit starts with unmorphed vertex data from the JSON and ends up with a jacket on a G8 at runtime.

    Every day of my waking life I dream about how awesome it would be if DAZ would do just a little bit more to help the enterprising nerds who are making DS better, for free. If there were a Developer Program like there is for Maya, I would have paid a pretty penny for it in a split second if it would have significantly cut down the more than six years I spent stumbling around in the dark, trying to get Sagan to work.

     

  • It used to be sold on RenderHub; here is the answer the author gave two years ago:

    Yes, it's a one-click solution. When you press the Update LODs button, it starts to collect object data. It calculates the size of, and the distance from the active camera to, each object's surface bounding boxes (not the object's bounding box itself, but each surface's, because that makes much more sense). It also gets the render resolution, the texture path and the value (RGB or a single float value). Then it stores everything into data.json and runs the Python code, which is a built .exe file. The script weighs 60 MB because it contains Python and several libraries, so you don't have to install everything by hand.
    On the Python side the following happens. Using regex and removing suffixes, it tries to find the original images (LOD zero) from the filenames provided and uses those as a starting point. Then it creates links from the images to objects to make sure that only one image LOD will be created and applied. Then the script can hide objects that are too small, if you need it to, but it will ignore objects with emissive surfaces, because we don't want to hide light sources. After that the magic starts to happen. The script starts to create LODs using all the power of your CPU with multithreading (which is basically why the script can be 50 times faster than V3D Scene Optimizer, depending on your CPU speed and core count).
    The process of creating LODs is interesting. It opens each image using cv2, goes through each pixel with numpy, gets the minimum and maximum, and calculates the average value. The result is a delta factor (0.0-1.0) that tells us how much useful data the image contains. If it's 0, the image is considered solid and will be fully removed from the scene and replaced with its color value or float value multiplied by the value set in the property. So instead of using a solid black image we will have the color set to [0, 0, 0]. But there's another case where the script will want to temporarily hide an image from the scene: when the object is too far away and doesn't really contribute. Then the script gets the image's average color, multiplies it by the property value, removes the image and applies the value. In this case the image path is stored in the object itself. You will find a "Dict Property" property, which is basically a string converted from a JSON dictionary containing all the hidden textures and original property values. That's why I consider the script to have a non-destructive workflow, as you don't store your data outside the project.
    But how does the script create images? When it comes to saving textures, it uses the Pillow library, which is capable of reading/writing ICC profiles and compressing images. The bigger the LOD, the lower the resolution and the higher the compression used, because we don't really want to fill our content library with overly heavy images. Each LOD is 2 times smaller than the previous LOD. You can also set the minimum texture resolution (256 by default), and the script will then clamp the image size when needed, taking the image's aspect ratio into account. When an image is created, it stores the data (filenames, values) in the dictionary and compares the new data to the old data, making sure that only useful changes are passed to Daz.
    Everything on the Python side happens within 0.1-5 seconds! The only issue is that Daz slows everything else down.
    When the Python script finishes its job, the Daz script reads data.json and applies all the changes (applies/removes textures, sets/stores values and creates/deletes Dict Properties). When all the work is done, the VRAM calculator starts reading the resolution of each image and displays the VRAM usage. If you have Texture Compression enabled, the actual value can differ.

    So, does the script work well? I'm an AVN developer, and I created a huge scene for my project that I tried to optimize with the V3D Scene Optimizer. I showed it in the video on YouTube: the whole "optimization" process could have taken me 20 minutes, but I gave up. That's basically why I started writing the script for my personal use, and then I decided to share it with my friends, who are also AVN developers. With them we fixed a few bugs, and right now I don't see any new bug reports.
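The "delta factor" test described above can be sketched in a few lines (the real script uses cv2 and numpy; this pure-Python version just shows the idea, and the function names are made up):

```python
def delta_factor(pixels):
    """0.0 means the map is a solid colour (so it can be dropped and
    replaced by a single value); values near 1.0 mean full dynamic
    range. `pixels` is a flat list of 0-255 channel values."""
    return (max(pixels) - min(pixels)) / 255.0

def replacement_value(pixels):
    """The average value a solid map collapses to, as a 0.0-1.0 float."""
    return sum(pixels) / len(pixels) / 255.0
```

A solid black bump map, for instance, has a delta factor of 0 and collapses to the value 0.0, so the image can be removed from the scene entirely, as the post describes.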

     

  • TiZ Posts: 26
    edited February 12

    Okay, so I just bought it, and I've tested it on one of my scenes. Specifically, this one.

    I had already used Scene Optimizer on this scene to reduce the texture size of select items. According to this tool, the scene weighs in at 4554MB. That makes sense; combine that with the around 2GB that Iray requires at baseline, and it fits inside my video card's 8GB of VRAM. But then I ran the LOD operation on the scene, and when it was done, somehow the scene size, according to the tool's own measurements, ballooned up to 11541MB!! How in the world did that happen?? And it doesn't seem to be a glitch in how it is calculating the value; the scene won't render in Iray unless I undo it. So it really is growing the scene somehow.

    Okay, let's try it with a different one, like this one. It actually does shrink: it goes from 5163MB to 2890MB. That's actually a considerable saving! But that still leaves the question on how the heck the other one grew. As for the rendering results?

    There are two things that are visibly affected. First, the lamps are shaded differently. I don't believe that's consequential. But one difference that *is* consequential is the fact that the flower clips are now more translucent. This is because I was doing something goofy with the flower clips' opacity map; I was reusing the glossy color map, but I used the Image Editor to set the Instance Color scale to 1.80. I don't remember why I did it that way; I probably shouldn't! But this information got thrown away in the process; I would consider that a bug. Instance Color and Instance Tiling definitely need to be restored on each map changed. I use Instance Tiling extensively on my characters' clothing so that bump map tiling can be independent of diffuse map tiling. It doesn't seem like it touched my characters' clothes, but if it had, it would have looked really bad! See the female character's skirt? The diffuse map is one giant circle, so it has to be 1x1. But the cloth detail--not that it can be seen in this one--needs to be 12x12 to look right. That's where Instance Tiling comes in handy.

    I'm very happy with the result on the second scene, but very confounded as to what happened on the first scene.

    I would also like the ability to choose where the modified texture maps get saved, the way Scene Optimizer currently offers.

    CRITICAL EDIT: Actually, speaking of Scene Optimizer, that actually shows what happened to the first scene. My female character's maps are ordinarily 3072x3072. I have 4096 ones, but she loads up with 3072 for most scenes. And Scene Optimizer had set all the maps in the environment to be 2048x2048. ...This utility erased all of those savings for some reason! It set my character's maps AND the environment maps back to 4096! Why did it pick bigger maps?! It was using smaller ones for a reason!! I'm hazarding a guess that it saw that there were textures with sizes at the end of the filename, as well as textures without the size at the end, which is a hallmark of how Scene Optimizer names its scaled textures, and just switched those back, and then... refused to do anything else.

    EDIT AGAIN: I originally had much harsher wording for what I thought about this problem. I do want to concede that it could be a bug. Perhaps it does go out of its way to avoid stepping on Scene Optimizer, and the codepath for that is not well-tested, so it's missing some steps. But the promotional material--mainly the video--spends a lot of time dunking on Scene Optimizer, so it really looks like it's intentionally spiting Scene Optimizer somehow. One thing I will concede is that Scene Optimizer's resizing algorithm is terrible, and the images usually look pretty bad. I have a script that redoes the resizes using ImageMagick. So maybe the angle here is "we can resize those images better"; that's a completely fair angle to take. ...But then it just... doesn't? And re-running the tool doesn't do it, either. It just refuses to shrink the scene any further.

    nye2025-04-05.png
    2160 x 2160 - 5M
    nye2025-04-06.png
    2160 x 2160 - 5M
    Post edited by TiZ on
  • But then I ran the LOD operation on the scene, and when it was done, somehow the scene size, according to the tool's own measurements, ballooned up to 11541MB!! How in the world did that happen?? And it doesn't seem to be a glitch in how it is calculating the value; the scene won't render in Iray unless I undo it. So it really is growing the scene somehow.

    The reason why it went from 4554MB to 11541MB is because the script searched for original textures first, undoing all the stuff Scene Optimizer did. 

    But this information got thrown away in the process; I would consider that a bug

    For now you can disable LOD for particular objects by selecting the props and clicking "Disable".

    It just refuses to shrink the scene any further.

    Use the Extra Downscale slider for further shrinking and adjust Clarity Radius to make all the objects within this radius have original textures.

  • Totte Posts: 14,843

    TheMysteryIsThePoint said:

    Ha, @Totte is showing his battle scars and the wisdom he's gained from them.

    Heh, yea, learning by doing, and what doesn't kill you makes you stronger and pain is just weakness leaving the body, or so they say.

  • TiZ Posts: 26
    edited February 12

    maguseveretus said:

    But then I ran the LOD operation on the scene, and when it was done, somehow the scene size, according to the tool's own measurements, ballooned up to 11541MB!! How in the world did that happen?? And it doesn't seem to be a glitch in how it is calculating the value; the scene won't render in Iray unless I undo it. So it really is growing the scene somehow.

    The reason why it went from 4554MB to 11541MB is because the script searched for original textures first, undoing all the stuff Scene Optimizer did. 

    Are you the author of the script? If so, what justification do you have for undoing the work that Scene Optimizer has already done? Especially without my consent? Like I said before, Scene Optimizer is the reason the scene was fitting in VRAM to begin with. If you want to re-resize the textures that Scene Optimizer made, that's probably a good idea, because most users probably don't have access to a script that conveniently redoes Scene Optimizer's resizing. But going back to original textures without consent and making the scene no longer fit in VRAM is an extremely confounding action to take.

    It's frustrating to know that I'll have to rename my character's textures to prevent you from changing them just because you think you know better.

    Post edited by TiZ on
  • Ainm Posts: 764

    Elor said:

    Ainm said:

    I've seen this product before somewhere. I wonder if I purchased it elsewhere already or if I remember it from a forum discussion somewhere. Anyone else recognise it? Don't want to buy it twice if I can help it, but can't for the life of me remember where I've seen it to check whether I have it already.

    It was sold on Renderhub.

    Thank you. That's indeed where I know it from!

  • TiZ said:

    maguseveretus said:

    But then I ran the LOD operation on the scene, and when it was done, somehow the scene size, according to the tool's own measurements, ballooned up to 11541MB!! How in the world did that happen?? And it doesn't seem to be a glitch in how it is calculating the value; the scene won't render in Iray unless I undo it. So it really is growing the scene somehow.

    The reason why it went from 4554MB to 11541MB is because the script searched for original textures first, undoing all the stuff Scene Optimizer did. 

    Are you the author of the script? If so, what justification do you have for undoing the work that Scene Optimizer has already done? Especially without my consent? Like I said before, Scene Optimizer is the reason the scene was fitting in VRAM to begin with. If you want to re-resize the textures that Scene Optimizer made, that's probably a good idea, because most users probably don't have access to a script that conveniently redoes Scene Optimizer's resizing. But going back to original textures without consent and making the scene no longer fit in VRAM is an extremely confounding action to take.

    It's frustrating to know that I'll have to rename my character's textures to prevent you from changing them just because you think you know better.

    This outrage is a bit unwarranted. Using two different scripts to do the same thing at the same time just isn't a good way to use either script. They are bound to step on each other in some way.

    You can always rerun Scene Optimizer on the original scene. Neither script modifies the original texture assets (I hope). And I hope you have a backup copy of the DUF file. I never run Scene Optimizer on the original DUF; I save as, and then run it.

  • DasTri Posts: 172

     Using two different scripts to do the same thing at the same time just isn't a good way to use either script.

    Quoted for truth - as a developer and user.

     


     
