Constantly Dropping to CPU
MelissaGT
Posts: 2,611
Hi there - I apologize for posting in this forum and not technical support...but I really don't get much response in the technical support space...
I'm trying to get a test render to go for the attached scene...just a small 1,000 x 1250px test render. It consists of 2 G8 figures at SubD 1, a bunch of dForce clothing that has all been turned into morphs so the actual dForce simulations are gone, and a pretty heavy background set with some instanced grass. It keeps dropping to CPU and I can't figure out why. I ran it through scene optimizer, but no help there. If I remove Hog and Barrel (the background set) entirely from the scene, the two figures will render on the GPU. However, if I leave Hog and Barrel just invisible, it drops to the CPU. If I try to set up a canvas of just the two figures by themselves, it still drops to the CPU. I thought the entire point of canvases was to be able to render out a heavier scene in pieces?
Earlier I was on the most recent Studio Driver. I just tried the most recent Game Ready driver from NVIDIA with the same result. I also just tried the DIM update for Daz 4.12. Still no dice. Instance Optimization is set to MEMORY. I have been shutting down and restarting DAZ with each failed attempt. And I have rebooted after my driver install. My system specs -
NVIDIA GTX 1080TI FTW3
32GB Corsair 3200 RAM
AMD RYZEN 7 1800x CPU
Here is the part of my log file from the render attempt. It keeps spitting on and on about OptiX optimization...but I DO NOT have that box checked in the settings. And I don't get why it's erroring out at the "Unable to allocate 32000000 bytes from 25793331" ...that's only .032GB. I've shut down all other apps that I can. No browser windows open. I'm not sure what else I can shut down. Daz never used to give me grief like this with larger scenes...way larger scenes than this one. But with 4.12 Daz has been getting harder and harder to coax into using the GPU and I don't understand why.
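Just to sanity-check those two numbers myself, here's the arithmetic in plain Python (assuming the second figure in the message is how many bytes were actually left in that pool - that's my reading of it, not something the log states):

    # Sanity-checking the two numbers from the log message (assumption: the
    # second number is how many bytes were actually available in that pool).
    requested = 32_000_000   # bytes
    available = 25_793_331   # bytes

    print(f"requested: {requested / 1e9:.3f} GB ({requested / 2**20:.1f} MiB)")
    print(f"available: {available / 1e9:.3f} GB ({available / 2**20:.1f} MiB)")
    print(f"shortfall: {(requested - available) / 2**20:.1f} MiB")

So the failed request really is only about 30 MiB, which to me just suggests the card was already effectively full by the time that last allocation was attempted, not that the 32 MB itself is the problem.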


Comments
*Update* I ran Hog and Barrel through Scene Optimizer again and dropped all the building elements from SubD 3 to SubD 1 along with deleting anything I could that wasn't in front of the camera. It rendered on the GPU with the two G8 figures invisible...hopefully I can squeeze them in.
That is just nuts that the building elements are even SubD in the first place IMO.
Good luck with the render!
I was able to squeeze the figures in too, but yeah...I can see having the SubD for close-up shots, but this is all blurred out anyway, so it's not even noticeable. But what really gets me is that this scene the way it was originally would not have been a problem on 4.10 or 4.11. I wish I could go back to an older version of Daz...but that ship has sailed. I get so furious when I think that I'm for all intents and purposes being pushed into having to drop $1400 on an RTX card just to be able to do a scene I would have been able to do no prob on the 1080TI six months ago.
I really doubt you could have ever rendered the background on a 1080ti if it was all SubD 3. That's a lot of geometry.
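Rough numbers to back that up - each SubD level on an all-quad mesh roughly quadruples the face count, so SubD 3 balloons fast. The base face count below is made up purely for illustration:

    # Rough geometry growth under subdivision (assumption: ~4x faces per
    # SubD level on an all-quad mesh; the base count is hypothetical).
    base_quads = 50_000  # e.g. one building prop before subdivision

    for level in range(4):
        faces = base_quads * 4 ** level
        print(f"SubD {level}: ~{faces:,} faces")

Dropping a prop from SubD 3 to SubD 1 is therefore roughly a 16x cut in geometry, which lines up with why the scene suddenly fit.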
IIRC, RTX cards will actually use a bit more memory than the GTX cards to support the RTX functions. Even though it may only be a small difference, it's doubtful that upgrading to an RTX 2080ti would have any benefit in terms of memory consumption. I have an RTX 2070 with 8GB, and have had more problems dropping to CPU than I did with a GTX 970M (6GB). Of course I also "upgraded" to Win10 from Win7 with the new machine, and I'm sure that the extra memory reserved by Win10 isn't helping at all. Often after working on a scene for a while it will drop to CPU for no apparent reason. When this happens I save the scene, close DS, then restart and reload the scene, and then it renders without issues. It sounds like your problem was more due to memory hungry content though.
It seems some vendors are making increasingly memory hungry assets though, so it's probably only going to get worse. I really wish Nvidia would up their game and give us out of core geometry and texture capabilities. Being limited by GPU memory is something that most (all?) other GPU based render engines fixed/addressed a long time ago.
Actually, I generate a whole bunch of canvases with lights, specular layers, depth and selected figures/objects in one go. You can use canvases for compositing multiple renders, but only if you can wrangle the shadows when doing your renders in multiple passes. I've not done scenes that way, though.
Yeah, learning that GTX cards now have a memory penalty vs RTX cards is what pushed me to upgrade from my 8GB GTX card.
In a case like this, where there's separation between your characters and the background, maybe you could create an HDRI of the scenery and then do another render with your figures?
Really short (and simplified) answer: the Iray code was changing to bring in the new RTX series. It started out having to do with the OptiX warning that you are getting. That OptiX code change especially affected pre-RTX cards.
What is happening to your renders has been discussed a lot over in the Daz Studio Discussion - Pro & Beta threads, off and on over the last many months, going back I think to fall of 2019.
RTX cards were also affected. I had scenes that I used to render on a GTX 1070 and it handled them better in key ways than my RTX 2080ti. Posted that long ago in one of the support threads.
The change-log for the new SIMM feature says that CPU fallback doesn't happen.
I can't say, cos am still on 4.12.0.86 with an RTX 2080ti. Did try the latest Public Beta but didn't see big enough improvements for my test scenes to make the migration...yet. But am still making assets & mods, so just maybe max large scenes are better now?
There have been other optimizations to help get the GTX cards working better again. Supposedly this new Beta even has a low memory option. May be of value to read through those threads, including the changelogs?
Sorry, not being a GTX user, I haven't really paid attention beyond that info.
Did you by chance file a ticket with support and ask for an old version? You could then compare the latest Beta against the version that you remember working best. That's how I would approach support if I was in your booties.
PS. Response in tech support thread is hit and miss. And some of it is "curiously" inconsistent from what I've seen over last 12 plus months. Hoomans are funny.

Anyway, as a frequent lurker, cos I don't want to make time to post, and deal with all "that", just wanted to say, thanks for sharing your excellent quality standout renders with fun themes.
Hmmm, maybe my "IIRC" has been rendered outdated by new information, and is now rendered useless. Thanks, nice to know that RTX cards might be more memory efficient now than GTX cards .... I feel better now.
It wasn't all SubD 3...it was I think the front of the hobbit hole, the door, and the front wall. But cutting those few down, along with deleting out all the grass instances off-camera seemed to do the trick. I must have been just over the threshold. *grumbles repeatedly about Daz not having an easy way to tell exactly how big a scene is*
I believe beauty canvases take shadow into account...at least they did the few times I've tried them. However, I always got a black border around the layers, which was a whole other problem and is typically why I avoid layers to begin with (though threads I've read about that issue stated it was because of saving in .png vs .tif, and I recently changed to .tif but haven't been able to test that yet). I considered the HDRI thing, or just rendering the background as an image on its own (honestly, I put so much work into my backgrounds I've considered putting together a background pack similar to what I've seen other folks selling on that other site, haha)...but that would still require compositing in post and fiddling with proper shadows. That's not the sort of post-work that I'm very good at.
Thank you so much for the information! I'll have to download the beta and give that a shot. I had participated in the beta before 4.12 came out and I honestly forgot how to get the newest version. I'll have to research that. It would be wonderful if they are trying to bring back or at least increase support for the GTX cards because I know I used to be able to dump more into a scene than I can with this current iteration. Could I go buy a RTX card today? Yes...I keep checking Newegg and eyeballing the 2080TI FTW3 (I love the FTW models...those are my go-to). But then I keep reminding myself that the 3x cards should be coming out this year and I'd feel like a stunad if I rage-buy a card today only to see the new cards drop in two months. Rage-buy. Haha. I'm keeping that term for use later.
And I'm so glad you appreciate my stuff! Sometimes (and sometimes more than sometimes) lack of feedback gets me down...but then comments like this help me to keep posting and to stick with the themes that are close to my heart rather than joining the masses and changing what I do to fit in better.
Yeah, a Beauty canvas with no selection is just your whole image, but the canvas is 32-bit. With a selection, it's just that selection cut out of the image, with or without an alpha mask. If something is casting a shadow on it, it will still be there.
You could try doing the HDRI thing with ground, so you'd still get your shadows. Maybe try the technique with a simpler quick-rendering set with a couple of stand-in primitives to see how it goes.
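If you do end up compositing canvases in post, the basic operation is just a straight alpha-over. Here's a minimal sketch with Pillow in Python - the file names are hypothetical, and since Daz canvases come out as 32-bit EXR (which Pillow won't read) it assumes you've first exported the layers to ordinary 8-bit TIFF or PNG with a clean alpha channel:

    # Minimal alpha-over compositing sketch (hypothetical file names).
    from PIL import Image

    background = Image.open("background_render.tif").convert("RGBA")
    figures = Image.open("figures_canvas.tif").convert("RGBA")

    # Both layers must be the same size; alpha_composite puts the figures
    # layer over the background using the figures layer's alpha channel.
    composite = Image.alpha_composite(background, figures)
    composite.convert("RGB").save("final_composite.tif")

Photoshop or GIMP does exactly this when you stack the layers, so this is more to show why a clean (non-premultiplied, non-missing) alpha matters than something you'd actually need to script.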
This product is far from perfect, but it does help give you a better idea about your scene: https://www.daz3d.com/iray-memory-assistant
I am hoping there will be sales on the 2080TI next week when retail goes back where I am...
Thanks for the link! I have Scene Optimizer and that does most of the same thing and more. But that last promo image shows something that might be useful.
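For a very rough back-of-the-envelope idea of where texture memory goes without any product at all, this kind of arithmetic gets you in the ballpark - the map list below is made up, and Iray's real footprint depends on compression, instancing, mip levels and so on, so treat it as a sketch only:

    # Very rough uncompressed texture memory estimate (assumptions: RGBA,
    # 1 byte per channel, no mipmaps or compression; real Iray usage differs).
    def texture_mib(width, height, channels=4, bytes_per_channel=1):
        return width * height * channels * bytes_per_channel / 2**20

    maps = {
        "diffuse 4096x4096": (4096, 4096),
        "normal 4096x4096": (4096, 4096),
        "roughness 2048x2048": (2048, 2048),
    }

    total = 0.0
    for name, (w, h) in maps.items():
        mib = texture_mib(w, h)
        total += mib
        print(f"{name}: ~{mib:.0f} MiB")
    print(f"total: ~{total:.0f} MiB")

A single 4K RGBA map works out to 64 MiB uncompressed, which is why a character with a dozen 4K maps eats VRAM so quickly.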
Also check the Texture Compression settings in the Advanced Render Settings tab. I changed mine to 8200 for a portrait I was working on, based on a tutorial, and it stuck for a subsequent scene I started. So I kept dropping to CPU until I remembered to change it back to the original setting.
Yeah, I had already done that. It's a setting I change regularly depending on what I'm working on.
Hadn't heard of FTW models. Just looksied. Quality with RGB looks similar to my Gigabyte Aorus (though didn't look too long - time thingy as always). The Aorus RTX 2080, silly thing, is like 3lbs and has a metal leg stand going under the unsupported corner to help prevent card sag.
LOL, totally get "Rage-Buy." If Daz ever comes out with an animation package that works without lags moving from frame-2-frame, am gonna "Rage-buy" that - regardless of cost. All the other solutions require too much migration work or involve quality loss.
About vid cards. From what am hearing currently, Q4 2020 still seems reasonable for the 3080Ti release. Then the usual delays for this & that for us end-users, like Iray implementation and model availability.
https://nvidianews.nvidia.com/news/nvidia-announces-gtc-2020-keynote-with-ceo-jensen-huang-set-for-may-14
At this event coming up May 14th, the hope is we hear about the improvement of Ampere over Turing, but 3000 series details aren't expected yet. Some rumored specs (supposedly leaks) about the 3080Ti are almost too good to be true, so we'll see.
I should make more effort with feedback. Glad to hear that comment helps you share more of your amazing art. It's just hard when you comment for one artist whose artwork & themes you really like, while there are several other good artists & artworks too. Then you feel like a goob for not saying anything about their work, like am playing favorites. It's kinda tricky and almost takes calculus to figure out how to do it fairly over time. Maybe if I added a signature saying "Am here..wait..and now off and gone again. Sorry for missing anything.", lol. Not sure if that excuses arbitrary feedback.
I have a copy of 4.10, but don't yet have hog and barrel; you could ask for a copy of 4.10 from customer support.
I would wait.
I am, then again, I saved for a 1080ti, then decided on a 2080ti, then a Titan. I then ended up moving house and spent way more than the Titan card, but wanted to wait until I had saved for the card again. Now I'm waiting for the 3080ti, and likely the upgrade to the Titan. :)
LOL, the 3080 is going to cost a lot when it's new. Right now some sell the EVGA 2080 for around $1500 CAD, which is $1000 US or less. Next week in BC, Canada things are starting to open up again & it's a long weekend. I am hoping for a sale; it would be cool to get it in the $1300 to $1400 CAD range, hoping anyhow.
I made some decent commish $$, plus I can resell my 1080ti for around the $500 mark..
I wouldn't count on sales. Computer component sales have continued pretty steadily even during the quarantines; with so many people working from home, a lot of places declared computer parts retailers essential businesses. Also, the shutdown in China affected the component manufacturers' supply chains. Most are just now getting up to speed again, so supply is pretty tight.
Try checking on Amazon, I got a nice triple-fan 2080 Super for 1200 CAD.
As you suggested, I tried the latest beta and it's...amazing! I was able to spit out a 1,000px test render to 100% in 15 min. And I can't believe I was able to leave the CPU checked in render settings and it actually used both while rendering! Daz has never performed so well for me...I really hope this is a good sign that they are truly working on compatibility for GTX again. Thank you again for your suggestion!
Same here, I'm getting faster results with both ticked as well.
Cool, I am referring to the 2080ti.
I just checked Amazon - the 2080 Supers are 1100 to 1200, and the TIs are the same price as at my local PC store..
Edit: my local PC store has one for $1389 CAD, $240 off..Gonna stop by after work..
As long as it's not refurbished, that's a really good deal for a TI.
Doesn't look like it. The only difference with the ones for $300 more is 3 HDMI outputs vs 1 HDMI & DP, and I only need 1 for my monitor.. so...as long as they have them, or I will have one brought in if possible. Good thing it will work with my 3 yr old beast!
I wouldn't want only 1 HDMI...I don't *need* more but I would like to have the possibility of hooking up to my TV or another monitor if I want to. What brand are you looking at? Call me a beer snob, but I only buy EVGA GPUs, lol. I love their RMA support.
With 2 outputs you are good to go for a second monitor. You can always get an adapter to convert DP to whatever outputs your second monitor accepts. I have a second monitor on a DP to DVI adapter, myself. Now if only Daz could start supporting second monitors...
That's what I thought. I use a DVI to HDMI cable on my current card even though I have HDMI, cause I don't use sound from my monitor. As far as needing it for the TV, I have my old 980 laptop as a media player..
If I am not mistaken, one cannot download the beta as a standalone. In the meantime, Melissa has a valid point - ticking CPU & GPU seems to give a bit faster results...