Licensing Agreement | Terms of Service | Privacy Policy | EULA
© 2026 Daz Productions Inc. All Rights Reserved.
Comments
PNGs mean more data to download; what I was saying is that they don't make any difference to the amount of data passed to Iray.
I think it's telling you how many surfaces it is used on, not how many copies are in memory. Each surface property gets a pointer to the map it is to use; the map is not part of the surface itself, as I understand the process. Edit: you can demonstrate this - find a map that is used on two or more different surfaces, go to one of the surfaces and from the property's menu select Image Editor (not Layered Image Editor). Change the gamma. Find one of the other surfaces using the map and open the Image Editor for the map there - you will see the changed gamma value already in effect (this, incidentally, is why it's a bad idea to use the same map for a colour property, which will usually want a gamma of 2.2, and a control map, which will want a gamma of 1.0).
I have no problem with using PNGs for the texture maps, but why make them so big? When a 4096x4096 map is 30MB+ and a 2048x2048 is 6MB, that is a huge difference without a lot to show for it. It is excessive bloat and really not needed by the majority of Studio users. That is what I am saying, Richard. And I am still curious why Daz is so against letting customers know what the file sizes are prior to purchase.
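The rough arithmetic behind that comparison: halving each side quarters the pixel count, so PNG file sizes tend to drop by about 4x (the exact ratio varies with how well each image compresses). A quick sanity check:

```python
# Pixel-count ratio between a 4096x4096 and a 2048x2048 map.
def pixels(side):
    return side * side

ratio = pixels(4096) / pixels(2048)
print(ratio)      # 4.0 -> four times the pixels

# The sizes quoted above (30 MB vs 6 MB) are a 5x gap -- the same
# ballpark, since PNG compression efficiency varies per image.
print(30 / 6)     # 5.0
```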
If that is the case then why does it give you the option to reduce the number of texture maps in order to reduce the load on your video card?
about png to jpg: I second the recommendation for IrfanView. Looks a bit crude but a very competent little program, including batch ops.
I'd be wary about transparency; PNG can do that but JPG can't. Some of the maps use transparency (especially hair stuff?).
about Car Girl Texture addon: Just looked at the page. The number of outfits is astonishing. I've never seen anything with THAT many variations. 545 maps?? "60 dFCG Tank A Material Presets"? I think I'd rather applaud than criticise that. 3.3GB is still dang heavy for a texture pack, though...
Edit: 4k vs 2k
The difference is clearly visible. IF you zoom in really close. For a portrait, 2k is probably all you need; full body shot, 1k.
Who's to say how close you can get?
I think for today's hardware, 4k is a good balance. If you have yesterday's hardware, well, batch resize...
But for those who do need the higher resolution there would be no way of getting it, while higher resolution can always be downsampled.
Reducing the number of different maps will reduce the load; I thought the discussion was of multiple uses of the same map (which is no worse than a single use).
...yeah, I have a texture set, Classic Suits: Shader Presets and Merchant Resource, that is a whopping 5.973 GB total in seven files, which I still have not downloaded due to intermittent download failures.
The only download bigger was the IBL Skies Bundle which was 6 separate products totalling 26.747 GB, the largest single one being just 200MB shy of the above Suit Shaders in size. Fortunately I have a 1.25 TB limit and as I don't download other big files, films, music, or video, I shouldn't have much trouble, just that it will take a long time.
So in other words Daz would rather alienate the customers that can't afford top-of-the-line rigs in order to placate a small group of customers that can afford top-of-the-line rigs and banks of hard drives to store 4GB sets that will let them render a rock at 1/2 an inch to look realistic? And that is a question, not a statement or accusation. And honestly, how many people need or use that level of resolution?
I don't know where you got off track, but it has always been about Iray using multiple applications of a map for a given scene. I have only recently gotten a new computer that can run a newer version of Studio, and I know quite a few people that can't afford new computers and are stuck using older rigs with video cards that don't have double-digit VRAM to handle these kinds of sets.
It's not just a matter of how many need the full resolution over the full model - more likely people will want to zoom right in (think of all those fairy renders, for one). People who don't need the full image can use Scene Optimiser or just manually resize the images (as noted there are free tools which will do a batch resize).
Well, as I said above if it's the same file then it will take memory only once - Iray gets what DS sends it, and DS does not make multiple separate copies of maps that are used on multiple surfaces.
If anyone is using Daz Studio 4.10 or earlier, Iray will load the maps for every surface they're used on. However, in Daz Studio 4.11 and up, the maps are only loaded once. The exception to the rule, and this has always been the case, is that using the same map in multiple channels on the surface, such as using the bump map for gloss and top coat, will cause the map to be loaded again for each channel it's used in.
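A rough sketch of that 4.11+ behaviour (this is not Daz's actual code, and the surface/channel names are made up; later posts in the thread refine the per-channel rule to channel *classes*, but the dedup idea is the same):

```python
# Hypothetical model: Iray keeps one texture copy per unique
# (channel, filepath) pair -- reusing a map on more surfaces is free,
# reusing it in a different channel loads it again.

def count_loaded_textures(assignments):
    """assignments: list of (surface, channel, filepath) tuples."""
    loaded = set()
    for surface, channel, path in assignments:
        loaded.add((channel, path))   # surface identity doesn't matter
    return len(loaded)

scene = [
    ("skin_face",  "diffuse", "face.png"),
    ("skin_torso", "diffuse", "face.png"),  # same map, other surface: free
    ("skin_face",  "bump",    "face.png"),  # same map, other channel: loads again
]
print(count_loaded_textures(scene))  # 2
```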
Alright, I was intrigued, so I did some testing. I loaded a texture onto every single image slot in the material tab for a cube - more than 20 slots using the same texture - and hit render. Iray loaded 3 textures, so the texture was loaded 2 duplicate times, but far less than the number of duplicates within the surface (one of the slots that added a duplicate was the normal map; I could not figure out what slot might be responsible for the other).
Just to be thorough, I then duplicated the cube twice, so that's 3 cubes, each with every single texture slot using the same image; for good measure I added a cone and applied the same material. Iray loaded 3 textures.
After more testing: anything plugged into a slot labeled "color" loads together, normal maps load by themselves (to be fair, if you're plugging a normal map into any other slot you have bigger issues), and everything else loads together. So if you plug the same texture into Diffuse Color, Bump, and Normal Map, Iray will load that texture 3 times, but if you have the same texture in Diffuse Color, Translucency Color, and Glossy Color it will load once,
and if you have the same texture on the diffuse on 50 different objects it will load once
This is obviously only one test and I could be wrong
Also, textures in slots like Bump or Translucency Weight have less memory load than slots like Base Color, as they have only one channel rather than three; this is probably why they load twice. Which also means that even if the texture is loaded twice, the memory required is not doubled.
In one of my tests, with Iray loading the texture once the memory hit was 16 MB, but when the texture was loaded twice the memory hit only increased to 20 MB.
So, more testing, and it seems if you plug a 4k texture into a color slot its memory hit is 64 MB; if you plug that same texture into a weight slot its memory hit is 16 MB. This isn't really relevant, but it is pretty nifty and I didn't know it before now.
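Those numbers line up exactly with uncompressed per-pixel sizes - 4 bytes per pixel (RGBA) for a color slot, 1 byte per pixel for a single-channel weight slot:

```python
# Uncompressed in-memory size of a square texture, in MiB.
def texture_mib(side, bytes_per_pixel):
    return side * side * bytes_per_pixel / (1024 * 1024)

print(texture_mib(4096, 4))  # 64.0 -> matches the 64 MB color-slot hit
print(texture_mib(4096, 1))  # 16.0 -> matches the 16 MB weight-slot hit
print(texture_mib(2048, 4))  # 16.0 -> why downsizing to 2k helps so much
```

This also explains why on-disk PNG size is a poor predictor: Iray holds the pixels uncompressed, so a heavily compressed 4k PNG and a lightly compressed one cost the same VRAM.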
That's interesting... The 64 MB in the color slot is saying the texture was loaded as a 32-bit file, was it?
The positive information comes from the 16 MB in the weight slot, which suggests DS is reading the file only as the 8 bits it needs - in that case the maps having (unnecessary) 24-bit color depth doesn't have as much negative impact as I thought.
This is exactly what I have been talking about, Richard. Not everyone can afford a new computer that meets the minimum requirements for newer versions of Studio. These new sets are overloading older systems, and instead of Daz doing anything to help, they are forcing people to find other means to make the sets/characters/props work that they have already paid good money for. Thank you, Matty, for giving a straight answer on this issue.
Come to think of it, that also explains loading a texture multiple times... If one slot accepts a 24/32-bit image and another only uses an 8-bit image, DS may internally convert the one in the 8-bit slot to another image.
I wrote this one a while ago at someone's request:
https://taosoft.dk/software/freeware/dazdim-ztc/
It can scan a zip (currently DAZ only) for textures, resize them, and then write a new zip which is identical except for the resized textures. This way you get both a smaller installer zip and smaller textures in your runtime. I took it down recently, though, because I discovered a bug in a 3rd-party dll causing tiff files to be saved as png after resizing. This has been fixed now, but I haven't got the update up for download yet; I will see if I can get it done later today or tomorrow.
shadowhawk 1 is correct. The only people who need 8k+ textures are those doing Architectural Visualization, in which case they will have enthusiast made PCs to do so, which can't be said for your average 3D hobbyist. Not everyone here is rocking a WP Guru level rig.
Looking at the wireframe promo of that particular product along with the texture information on the store page, I can say with the utmost certainty that there was zero effort made to optimize that particular product before release. And to be clear, I'm not saying that set is "bad"; on the contrary, it is detailed very well - too well for use outside of professional filmmaking. I really do wish Daz would require the inclusion of a high-poly-count warning for such sets.
actually 4X reduction with V3D scene optimiser was all it needed
I can render it easily now
so there is that
Glad you got it working! Sooner or later, Daz might want to look into acquiring Scene Optimizer outright to include in the base software, given a good chunk of folks are increasingly using it (myself included!)
this is the only way they will be able to stop people returning stuff they cannot render unless they buy new graphics cards etc
What you are asking is that Daz deprive customers who have invested in newer hardware of access to higher quality assets. The thing here is that people who can afford newer hardware also likely have bigger budgets for content purchases. Daz is also looking to sell store content to users who do not use DS at all, and higher quality assets make Daz content more attractive to them, as well. Given how trivial it is for customers with less powerful systems and a little knowledge to downsample textures, either with free software like IrfanView, or using Scene Optimizer, I don't see that happening, nor should it happen, for reasons already mentioned in this thread.
The suggestion that Daz provide file size information for the maths-challenged has merit, although storage and bandwidth availability is expanding cheaply and quickly (everywhere except at Daz's chosen content delivery network), and will become less of an issue in the near future.
soooo if we are not psychic we are maths challenged???
Texture Haven, Turbosquid, CGaxis etc all must be catering to the mathematically challenged because they all list file sizes, polycounts where applicable and in most cases total number of texture maps and their sizes
I actually don't think this is too much to ask
saying something is high resolution is meaningless,
us mathematically challenged ones want figures and data
Daz does provide some sizing information, though it's true it isn't broken down into how many maps of a given size there are. I still say that size in whatever-bytes is not useful, as it depends on the format of the data and compression settings, and so isn't any kind of reliable indicator of resource consumption at render time.
In most of the scenes which are "heavy", the heaviness is not due to "higher quality" but to sloppy work with UV mapping, combined with multiple material maps with unreasonable image sizes and color depths (for example an 8192x8192x24-bit metallicity map that is just one colour: pure black).
Attached is a real-life example of a UV map that was used for just one surface, and the surface had 5 maps attached to it, all 8192x8192px. Even if you downgraded the textures to 4k, you couldn't use them anymore, because the vertical resolution of the used UV cloud was already as small as it could be. As it was, the combined wasted space, which did nothing to improve the quality, was over 500 MB - and that was just one surface.
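Rough arithmetic for that example - five 8192x8192 maps at 24-bit colour on one surface (the 55% empty-UV figure below is an illustrative assumption, not taken from the post, just to show how quickly the waste passes 500 MB):

```python
# Uncompressed size of a square 24-bit (3 bytes/pixel) map, in MiB.
def map_mib(side, bytes_per_pixel=3):
    return side * side * bytes_per_pixel / (1024 * 1024)

per_map = map_mib(8192)
print(per_map)             # 192.0 MiB per 8k 24-bit map
print(per_map * 5)         # 960.0 MiB for the whole surface

# If ~55% of each map is empty UV space (hypothetical fraction),
# the waste already exceeds 500 MB:
print(per_map * 5 * 0.55)  # 528.0
```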
Well yeah. Some PAs have "challenges", too.
1. I don't recall that I said to put a limit on file sizes; however, just how much of a difference would it make between using a 4096x4096 png file and a 2048x2048 png file? Asking PAs to offer dual downloads wouldn't be considered a reasonable solution either, but customers do have a reasonable right to know the size of the products they are buying. Every item offered at Renderosity has the number of files and their sizes, as well as the readme, viewable prior to purchase, so what is wrong with requesting that Daz do the same? Some of this could also be alleviated if Daz were to realize that not all of their customers use top-of-the-line systems and incorporate some kind of native file-reduction script into Studio.
2. As I have said already, I live and work on a Department of State-run base in Iraq and currently pay $70 a month for 512kbps internet, so any download over about 30 MB takes upward of an hour. I could get a 1mbps download plan, but that would run me $140 a month. I am sure that there are a lot of others in the same boat, in that they do not have economical access to high-speed internet that would cut downloads of this size down to minutes instead of days. I also have to try to limit the size of my content folder, as I am on a 12-hour evacuation plan, and trying to pack a large number of hard drives into a bag that can't exceed 20 pounds is a factor for me as well. Although this last part is fairly unique to me and not others.
3. How is it being math challenged when you don't have sufficient information to judge whether a given set will run on your system or not?
The download size does give you some indication: usually if the DL size is around 250MB there are no problems, but when it goes up to 2GB+ one could check beforehand whether there is a reason for it, and it would also serve those with slower and/or pricier connections.
The vast majority of download size and VRAM load comes from image maps, right? Since they don't compress any further, I'd say the total download size is a good indicator for what to expect.
Download size and vertex count would be MUCH better than what we have today (barely anything). I'm lucky to have an up-to-date rig, but still I care that resources are reasonable for what the product brings to the table.
Btw. another thing that tips the balance is that a PA wants to sell a new item not only today, but hopefully for years. In 2023 the rich kids will be running their RTX 6090 on an 8k display, and everyone else will have today's high end stuff. 1k maps are going to look pretty old by then...
It's difficult. Except for "provide clear information", that never hurts.
Why does this surprise you? I have felt this way ever since they started releasing products that are Iray-only, requiring high-end Nvidia graphics cards.
Simple solution, if the download is too big or the product has too many polys, return it for a refund.
I will say that this bothers me far more than file DL size, although I think the total file size should be stated for people who need to know. I have good HW, but this is what causes me headaches. I'll buy some environment, or other asset, and plan out a bunch of scenes using it. Then I start rendering and it drops to CPU. I cannot have 10 or 12 renders run to an acceptable level on CPU; that would take days. So I stop and have to optimize each scene - no big deal, that's a half hour or so if it works the first time. But there was one environment, which I will not name, that I had to cut every map by 4x to get it to fit on my 1080ti. Upon investigation, it had 8k maps everywhere and the UV maps were just as you described. I tried resizing one and it blew it up.
This is why Daz QA bugs me. There's no way that scene got loaded onto a PC and rendered by someone with a reasonable PC just as a sanity check, because it required a Titan or Quadro to render on GPU.
Too bad you won't name it, as PAs need to know so they can improve. I realize it's a fine line between constructive criticism and just plain being a jerk and complaining, but if DAZ isn't going to catch or care about these things so they can inform the PA, then it's up to the users.
Nonsense, you're putting words in people's mouths. Having PAs provide lower-resolution textures doesn't deprive customers who have invested in newer hardware of anything; it provides for those who aren't chasing after every new hardware release. And as for those people who can afford newer hardware, that doesn't necessarily correlate to having bigger budgets for buying every little thing in Daz's storefront, and even if it did, unless Daz starts operating an auction house, that doesn't translate to more sales.

You're right about one thing, though: Daz is looking to sell store content to users who don't use DS at all, and higher-quality assets do make Daz content more attractive overall. However, the inability of some PAs to provide reasonable texture maps, instancing, or even some basic retopology is not indicative of quality - quite the contrary, in fact. And let's not forget that Daz has been placing their Interactive Licenses on sale. You know the sort of customers that buy those? Game developers. You know what game developers need to worry about? The size of their assets, from poly count to texture data. Especially texture data.

The data size of a texture on disk isn't the same as when loaded in memory. Take the Old Crone's Home that was mentioned earlier, for example. It has a whopping 169 4K maps; that's 64 MB of uncompressed data per map at 169 maps, which is somewhere between 8.5 and over 10 GB (depending on alpha) of uncompressed texture data alone, and that's not counting the other 13 8k maps! But what do I know, I'm just one of those maths-challenged folk, right?
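The uncompressed totals in that post can be checked with the same per-pixel arithmetic used earlier in the thread (3 bytes/pixel without alpha, 4 with):

```python
# Uncompressed in-memory size of a square map, in MiB.
def map_mib(side, bytes_per_pixel):
    return side * side * bytes_per_pixel / (1024 * 1024)

count_4k, count_8k = 169, 13

rgb_gib  = count_4k * map_mib(4096, 3) / 1024   # no alpha
rgba_gib = count_4k * map_mib(4096, 4) / 1024   # with alpha
print(round(rgb_gib, 1), round(rgba_gib, 1))    # 7.9 10.6 -> roughly the quoted range

extra_8k_gib = count_8k * map_mib(8192, 4) / 1024
print(round(extra_8k_gib, 1))                   # 3.2 GiB more from the 8k maps
```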
Exactly. The large textures are not a problem by themselves if they can be scaled down, but sloppy UV mapping, as in my example, is - or having (for example) ten thousand bolt heads (1/2") that nobody pays any attention to, with each having its own UV cloud and each of those having 5 textures/maps at 8k 24-bit...
These kinds of problems take down even the best machines without giving any benefit in quality.